
Job Description

Key Responsibilities

1. Design, build, and maintain scalable and reliable data pipelines.

2. Integrate data from various sources, including APIs, relational databases, and NoSQL databases.

3. Ensure data availability and quality across data warehouse, data lake, and lakehouse environments.

4. Optimize ETL/ELT processes and data pipelines for maximum performance and efficiency.

5. Ensure data replication processes between systems or environments run correctly and efficiently.

6. Collaborate with data analysts and data scientists to deliver clean, usable data and actionable insights.

7. Communicate effectively with principals and external stakeholders in English, particularly for system integration, troubleshooting, and technical documentation.

8. Maintain data security and ensure compliance with company policies and standards. 

Requirements

Qualifications

1. Proven experience as a Data Engineer, Data Analyst, or in a related role.

2. Proficient in both spoken and written English (required for communication with international partners).

3. Strong understanding of APIs (REST, SOAP, or GraphQL).

4. Proficient in SQL and at least one programming language, preferably Python.

5. Solid knowledge of relational databases (e.g., PostgreSQL, MySQL) and/or NoSQL databases (e.g., MongoDB, Redis).

6. Familiar with Big Data concepts and tools such as Hadoop, Spark, or Kafka.

7. Experience with cloud platforms (AWS, GCP, Azure), particularly in data-related services such as BigQuery, Redshift, or Snowflake.

8. Understanding of data warehouse, data lake, and lakehouse architectures.

9. Experience managing or monitoring data replication between systems or environments.

10. Capable of performing data maintenance, optimization, and profiling.

11. Experience with data integration tools (e.g., IBM, Informatica, or Qlik Replicate).

Additional Value

1. Experience with data workflow orchestration tools (e.g., Airflow).

2. Familiarity with CI/CD practices for data pipelines.

3. Knowledge of data governance and data quality best practices.

Job Summary

Seeking a Data Engineer to build data pipelines, ensure data quality, and support integration. SQL, Python, and cloud experience required.


About PT Japfa Comfeed Indonesia

PT Japfa Comfeed Indonesia, Tbk is one of the largest and leading agri-food companies in Indonesia. We are a trusted producer of quality animal protein, faithfully serving the nation's needs and standing as a source of Indonesian pride since 1975.

Industry: Agri-Food
Location: Jakarta Selatan, Indonesia
Company Size: >100 employees

Culture

Collaborative: Here, we work together to make the dream work.
Quality: We place great emphasis on preserving the high standards of our products and services.
Integrity: We uphold honesty and virtue in every product and service we deliver.