Hello,
*Greetings!* This is *Avanish* from *NLB Services*. We are a global recruitment company specializing in hiring IT professionals. One of our clients is looking for a *Data Engineer / Data Scientist* in *Atlanta, GA (Hybrid)*.

*Position:* Data Engineer / Data Scientist
*Location:* Atlanta, GA (Hybrid)
*Type:* Contract

*Job Description:*

*Note:*
- Recent experience in the Payment Domain is required.
- Target candidates from TSYS, InComm Payments, Global Payments, or similar organizations.
- Strong ability to analyze enterprise payment data, identify patterns, and optimize processes.
- Expertise in deep-diving into complex datasets to derive actionable business insights.
- Candidates with a Banking and Financial Services background will be a strong fit.

*Responsibilities:*
- Design, build, and optimize scalable data pipelines to support business operations and analytics.
- Work extensively with AWS technologies such as S3, Lambda, Aurora DB, Glue, Kinesis, and Redshift for data processing and storage.
- Manage and enhance data workflows to streamline data ingestion, transformation, and storage.
- Develop and maintain ETL/ELT processes to ensure data accuracy, consistency, and security.
- Work with Enterprise Payment Systems to ensure seamless data integration and transaction processing.
- Design and optimize Aurora DB to support high-performance transactional workloads.
- Implement real-time data streaming solutions using AWS Kinesis or Kafka.
- Conduct deep-dive analysis on large datasets to drive strategic business insights.
- Collaborate with Data Scientists, Analysts, and Business Teams to develop data-driven solutions.
- Ensure data governance, security, and compliance best practices.
- Automate and optimize data operations for improved efficiency and scalability.

*Required Skills:*
- Bachelor’s/Master’s degree in Computer Science, Data Engineering, or a related field.
- 3+ years of hands-on experience in data engineering or related fields.
- Strong experience with AWS services (S3, Lambda, Glue, Aurora DB, Kinesis, Redshift, EMR, etc.).
- Proficiency in SQL, Python, or Scala for data manipulation and pipeline development.
- Experience working with relational and NoSQL databases, particularly AWS Aurora DB, PostgreSQL, or DynamoDB.
- Expertise in building and managing data pipelines for batch and real-time data processing.
- Hands-on experience with big data frameworks such as Spark, Hadoop, or Flink.
- Experience in Enterprise Payment Systems and handling large financial datasets is a plus.
- Strong analytical mindset with the ability to deep-dive into data and generate meaningful insights.
- Proficiency in data modeling, data warehousing, and data architecture best practices.
- Familiarity with CI/CD pipelines, Infrastructure as Code (IaC), and DevOps practices for data workflows.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.

*Preferred:*
- Certifications in AWS Data Engineering, Big Data, or Machine Learning are a plus.

*Thanks & Regards,*
*Avanish Pandey*
---------------------------------------
*Next Level Business Services, Inc.*
avanish.pan...@recruiter.nlbtech.com
(904) 290-8616 | LinkedIn: https://www.linkedin.com/in/avanish-pandey-83897493/

--
You received this message because you are subscribed to "rtc-linux". Membership options at http://groups.google.com/group/rtc-linux . Please read http://groups.google.com/group/rtc-linux/web/checklist before submitting a driver.