Intermediate Data Engineer at Sigma Software
Kyiv, Kharkiv, Lviv, Dnipro, Odesa, Vinnytsia, Ivano-Frankivsk, Lutsk, Ternopil, Cherkasy, Burgas (Bulgaria), Warsaw (Poland), Kraków (Poland), Lisbon (Portugal), Poznań (Poland), Remote
Join a globally recognized company where innovation meets impact. Our team of seasoned professionals and commitment to excellence enable us to deliver world-class solutions that shape industries and deliver meaningful results. Be part of a competency center where your expertise will be valued, your voice will be heard, and your career will be empowered.
Client
The client is an international technology company specializing in the development of high-load data processing and analysis platforms. Its core product helps companies manage large volumes of data, build models, and gain actionable insights. The company operates globally, primarily serving clients in the marketing and advertising sectors, and focuses on cutting-edge technologies, microservices architecture, and cloud-based solutions.
Responsibilities
- Design, develop, and maintain comprehensive, optimized, and scalable big data pipelines capable of processing large volumes of data in both real-time and batch modes
- Follow and promote best practices and design principles of Data Lakehouse architecture
- Support technology decision-making for the company's future data management and analysis needs by conducting proofs of concept (PoCs)
- Write and automate data pipelines
- Help improve data organization and accuracy
- Collaborate with data analysts, data scientists, and engineers to ensure best practices are applied to data processing and storage technologies
- Explore emerging technologies, stay on top of them, and proactively share lessons learned with the team
- Ensure that all deliverables adhere to our global standards
Requirements
- 3+ years of experience in big data development and database design
- Strong hands-on experience with SQL, including advanced SQL
- Proficiency in Python and other scripting languages
- Working knowledge of at least one big data technology
- Experience developing software solutions using Hadoop technologies such as MapReduce, Hive, Spark, YARN/Mesos, etc.
- English level Upper-Intermediate or higher
It would be a plus
- Experience with AWS cloud services, including S3 and Redshift
- Experience with Python
- Knowledge of and exposure to business intelligence applications, such as Tableau and QlikView
Profile
- Excellent analytical and problem-solving skills
https://jobs.dou.ua/companies/sigma-software/vacancies/328013/?utm_source=jobsrss



