Cognizant

Data Engineer

Posted 25 September 2024
Location: Singapore
Reference: 00060724881

Company's Benefits

  • Flexible Working Arrangements

  • Mentorship Program

  • Leadership Development Program

  • Paid Parental Leave

  • Return to Work Policy

  • Sponsorship Program

  • Raise Numbers Of Women In Leadership

  • Internal Women's Networking Group

Job Description

Job Responsibilities

  • Design, develop, and deploy data tables, views, and marts across data warehouses, operational data stores, data lakes, and data virtualization layers.

  • Perform data extraction, cleaning, transformation, and data flow management. Web scraping may also be part of the data extraction work scope.

  • Design, build, launch, and maintain efficient and reliable large-scale batch and real-time data pipelines with data processing frameworks (a minimal batch-pipeline sketch follows this list).

  • Integrate and collate data silos in a manner that is both scalable and compliant

  • Collaborate with Project Managers, Data Architects, Business Analysts, Frontend Developers, UX Designers, and Data Analysts to build scalable data-driven products

  • Develop backend APIs and work on databases to support the applications

  • Work in an Agile environment that practices continuous integration and delivery

  • Work closely with fellow developers through pair programming and code reviews
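
As a rough illustration of the data-cleaning and batch-pipeline work described above, here is a minimal sketch in Python using pandas and SQLite. It is a sketch under assumptions, not part of the role description: the file name, column names, and table names are hypothetical placeholders.

    import sqlite3

    import pandas as pd


    def run_batch_pipeline(csv_path: str, db_path: str) -> None:
        # Extract: read the raw source file (the path is a hypothetical placeholder).
        raw = pd.read_csv(csv_path)

        # Clean: normalise column names, drop duplicate rows, and drop rows
        # missing the assumed key column "order_id".
        raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
        cleaned = raw.drop_duplicates().dropna(subset=["order_id"])

        # Transform: derive a simple daily aggregate, standing in for a mart view.
        daily = (
            cleaned.groupby("order_date", as_index=False)["amount"]
            .sum()
            .rename(columns={"amount": "daily_revenue"})
        )

        # Load: write both tables to a local SQLite file standing in for a warehouse.
        with sqlite3.connect(db_path) as conn:
            cleaned.to_sql("orders_clean", conn, if_exists="replace", index=False)
            daily.to_sql("daily_revenue", conn, if_exists="replace", index=False)


    if __name__ == "__main__":
        run_batch_pipeline("orders.csv", "warehouse.db")

In a production setting the same extract-clean-transform-load shape would typically run on a scheduler and target a real warehouse rather than a local SQLite file.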

Job Requirements

  • Proficient in general data cleaning and transformation (e.g. SQL, pandas, R)

  • Proficient in building ETL pipelines (e.g. SQL Server Integration Services (SSIS), AWS Database Migration Service (DMS), Python, AWS Lambda, ECS container tasks, Amazon EventBridge, AWS Glue, Spring)

  • Proficient in database design and various databases (e.g. SQL, PostgreSQL, AWS S3, Athena, MongoDB, PostGIS, MySQL, SQLite, VoltDB, Cassandra)

  • Experience in, and passion for, data engineering in a big data environment using cloud platforms such as GPC and GCC (i.e. AWS, Azure, Google Cloud)

  • Experience building production-grade data pipelines and ETL/ELT data integration

  • Knowledge of system design, data structures, and algorithms

  • Familiar with data modelling, data access, and data storage infrastructure such as data marts, data lakes, data virtualization, and data warehouses

  • Familiar with REST APIs and web requests/protocols in general

  • Familiar with big data frameworks and tools (e.g. Hadoop, Spark, Kafka, RabbitMQ)

  • Familiar with the W3C Document Object Model and customized web scraping (e.g. BeautifulSoup, CasperJS, PhantomJS, Selenium, Node.js); see the scraping sketch after this list

  • Comfortable in at least one scripting language (e.g. SQL, Python)

  • Comfortable in both Windows and Linux development environments

  • Interest in being the bridge between engineering and analytics
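
To illustrate the web-scraping requirement above, here is a minimal sketch using BeautifulSoup together with the requests library. requests is an assumption, as it is not named in the listing, and the URL and CSS selector are hypothetical placeholders.

    import requests
    from bs4 import BeautifulSoup


    def scrape_titles(url: str) -> list[str]:
        # Fetch the page; a timeout and an explicit User-Agent are good practice.
        response = requests.get(
            url, headers={"User-Agent": "data-pipeline-bot/0.1"}, timeout=30
        )
        response.raise_for_status()

        # Parse the HTML into a DOM-like tree and query it with a CSS selector.
        soup = BeautifulSoup(response.text, "html.parser")
        # "h2.title" is a hypothetical selector; adjust it to the target markup.
        return [node.get_text(strip=True) for node in soup.select("h2.title")]


    if __name__ == "__main__":
        for title in scrape_titles("https://example.com/listings"):
            print(title)

The parsed tree mirrors the W3C Document Object Model mentioned above, so the same selector-based approach carries over to browser-driven tools such as Selenium when pages require JavaScript rendering.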