Deloitte is hiring for Data Engineer (Last Date: Not Mentioned)
Deloitte Touche Tohmatsu Limited, commonly referred to as Deloitte, is an international professional services network headquartered in London, England. Deloitte is the largest professional services network in the world by revenue and number of professionals, and is one of the Big Four accounting firms along with EY (Ernst & Young), KPMG, and PricewaterhouseCoopers (PwC).
| Criteria | Details |
| --- | --- |
| Graduation Year | Graduated in 2023 |
| Minimum Academic Percentage | Not Mentioned |
| Backlogs | Must have 0 backlogs |
Job Function & Salary
Job Role – Data Engineer – Consultant or Sr. Consultant
CTC – Not Mentioned
Work Location – Not Mentioned
Date of Joining – Not Mentioned
Job Responsibilities
- Develop solutions with an Agile Development team.
- Define, produce, test, review, and debug solutions.
- Create component-based features and micro-frontends.
- Develop and maintain Postgres databases.
- Create comprehensive unit test coverage in all layers.
- Deploy solutions to Docker containers and Kubernetes.
- Help build a team culture of autonomy and ownership.
- Work with a Product Owner to refine stories into functional use cases and identify the work effort as tasks.
- Participate in test case creation responsibilities and peer reviews prior to coding.
- Review implementation plans of peers prior to their coding.
- Demonstrate feature work at the end of each iteration.
- Work from home when desired with infrequent visits to the office and limited travel for planning sessions.
- Develop our ETL process to be a robust automated production quality solution and lead the implementation and delivery.
- Partner with the application engineering team to ensure the data model fits the needs of the solution while promoting best practices in its design, from both a maintenance and a performance perspective.
- Partner with the data science team to understand their needs when preparing large datasets for machine learning.
- Help the team understand the execution plans of poorly written queries, and remediate performance problems by tuning queries and/or refining the data model to meet the needs of the business.
- Build data systems and pipelines.
- Evaluate business needs and objectives.
- Explore ways to enhance the product and pipeline.
- Collaborate with the team.
- Showcase skills and innovative ideas to the team biweekly.
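The ETL and query-tuning responsibilities above can be sketched with a minimal, illustrative example. All table and column names here are hypothetical, and SQLite stands in for PostgreSQL so the sketch stays self-contained (on Postgres you would use `EXPLAIN` / `EXPLAIN ANALYZE` instead):

```python
import sqlite3

# Hypothetical example: a tiny ETL step plus query-plan inspection.
# SQLite stands in for PostgreSQL to keep the sketch self-contained.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract/Load: raw events land in a staging table.
cur.execute("CREATE TABLE events_raw (user_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO events_raw VALUES (?, ?)",
                [(i % 100, i * 0.5) for i in range(1000)])

# Transform: aggregate into a reporting table.
cur.execute("""
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM events_raw GROUP BY user_id
""")

# Query tuning: inspect the plan, add an index, and compare.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM user_totals WHERE user_id = 42"
).fetchall()
cur.execute("CREATE INDEX idx_user ON user_totals(user_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM user_totals WHERE user_id = 42"
).fetchall()

print(plan_before[0][-1])  # full table scan before the index exists
print(plan_after[0][-1])   # index lookup once idx_user is in place
```

The before/after plans show the kind of evidence used when remediating a slow query: the first plan scans the whole table, the second searches via the new index.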
Skills Required
- Strong knowledge of Python & SQL.
- Hands-on experience with SQL database design
- Hands-on experience with, or knowledge of, Airflow.
- Knowledge of Docker and Kubernetes
- Experience with running containerized microservices.
- Experience with Apache Spark or AWS EMR.
- Experience with cloud platforms (AWS, Azure) with strong preference towards AWS.
- Experience in Database design practices
- Technical expertise with data warehouses or data lakes.
- Expertise in configuring and maintaining PostgreSQL.
- Experience performance tuning queries and data models to produce the best execution plan.
- Strong experience building data pipelines & ETL.
- Experience working on an Agile Development team and delivering features incrementally.
- Experience with Git repositories
- Working knowledge of setting up builds and deployments
- Experience with both Windows and Linux.
- Experience demonstrating work to peers and stakeholders for acceptance
- Experience working with Azure DevOps, JIRA or similar project tracking software.
- Experience working in a startup environment
- Experience with data streaming tools such as Apache Kafka, AWS Kinesis, Spark Streaming, or similar.
- Experience with other big data technologies at scale.
- Experience with BigQuery or similar (Redshift, Snowflake, other MPP databases)
- Knowledge of when to use NoSQL versus a traditional RDBMS.
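As a rough illustration of the pipeline-building skills listed above, a generator-based extract/transform/load sketch in Python follows. Everything here (the record format, field names, and aggregation) is a hypothetical stand-in, not part of the role's actual stack:

```python
from typing import Iterable, Iterator

def extract(rows: Iterable[str]) -> Iterator[dict]:
    # Parse raw CSV-like lines into records; malformed lines are skipped.
    for line in rows:
        parts = line.strip().split(",")
        if len(parts) == 2:
            yield {"user": parts[0], "amount": float(parts[1])}

def transform(records: Iterable[dict]) -> Iterator[dict]:
    # Keep only positive amounts and normalise the user key.
    for r in records:
        if r["amount"] > 0:
            yield {"user": r["user"].lower(), "amount": r["amount"]}

def load(records: Iterable[dict]) -> dict:
    # Aggregate per user (a stand-in for a warehouse write).
    totals: dict = {}
    for r in records:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

raw = ["Alice,10.0", "BOB,-5.0", "alice,2.5", "broken line"]
totals = load(transform(extract(raw)))
print(totals)  # → {'alice': 12.5}
```

Because each stage is a generator, records stream through one at a time rather than being materialised in memory, which is the same shape a production pipeline takes at larger scale (with Spark, Airflow tasks, or Kafka consumers replacing the plain functions).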
Last Date to register – Not Mentioned