-Experience with object-oriented/functional scripting languages: Python, C#, Scala, etc.
-Work with data and analytics experts to strive for greater functionality in our data systems.
-Assemble large, complex data sets that meet functional and non-functional business requirements.
-Expert knowledge of NoSQL and RDBMS databases such as HBase, Hive, Cassandra, MongoDB, Redis, Microsoft SQL Server, and Elasticsearch
-Big data scripting in PySpark
-Familiarity with tools such as Anaconda, PyCharm, and Microsoft Visual Studio
-Understanding of MVC/MVT architecture
-Experience developing data migration and data ingestion services and REST APIs using Node.js or a Python web framework
-Experience creating data lakes using BigQuery, Elasticsearch, Kafka, HBase, etc.
-Exceptional analytical, quantitative, problem-solving, and critical-thinking skills
-Familiarity with JavaScript (Node.js a plus)
-A working understanding of code and scripting (Python, Django, Flask, PowerShell, Bash)
-Experience with full-stack application development
-Good knowledge of OOP, design patterns, and the Collections framework
-Knowledge of parallel computing and hands-on experience with MapReduce algorithms is a bonus
-Experience working on AWS, GCP, or Azure
-Familiarity with Apache Airflow is a plus
-Personal or professional projects in deep learning are a bonus
-Sense of responsibility and ownership of assigned tasks
-Strong communication skills
-Self-starter willing to learn on the fly