*Role: Data Engineer*
*Location: Boston, MA – candidates need to relocate and be onsite from day one*
*Duration: Long Term*

*Mandatory Skills:*

   - API Development
   - Microservices Architecture
   - REST APIs

*Duties and responsibilities:*

   - Use big data technologies to develop distributed, fault-tolerant,
   scalable data solutions.
   - Participate in discussions with customers, along with the product
   team, to understand their data requirements.
   - Translate business requirements into corresponding data
   requirements.
   - Collect and process data at scale from a variety of sources for
   different project needs.
   - Participate in identifying, evaluating, selecting, and integrating big
   data frameworks and tools required for the big data platform.
   - Design, develop, and maintain data pipelines and data platforms using
   selected frameworks and tools, based on requirements from different projects.
   - Convert structured and unstructured data into a form suitable for
   processing. Provide support to different teams in analyzing data.
   - Design, develop, and maintain data APIs.
   - Integrate data from a variety of data sources using federation /
   virtualization techniques.
   - Develop solutions independently based on high-level design and
   architecture with minimal supervision.
   - Monitor the performance of the data platform on a regular basis, and
   tune the infrastructure and platform components accordingly to ensure the
   best performance.
   - Maintain a high level of expertise in data technologies and stay
   current on the latest data technologies.

*Qualifications*


   - 10+ years of overall experience in software design and development.
   - 5+ years of experience in data engineering.
   - Prior experience implementing big data platform components that are
   scalable, high performing, and low in operational cost.
   - Proven experience with integration of data from multiple heterogeneous
   and distributed data sources.
   - Experience processing large amounts of data (structured and
   unstructured), building data models, data cleaning, data visualization, and
   reporting.
   - Experience in production support and troubleshooting.
   - Hands-on knowledge of containers and of API design and implementation
   is a must.
   - Experience with NoSQL, graph, relational, and time-series databases.
   - Excellent knowledge of various ETL techniques and frameworks,
   messaging systems, stream-processing systems, big data ML toolkits, and big
   data querying tools.
   - Experience with Python, Go, Perl, JavaScript, Kafka, Spark, and
   Kubernetes.
   - Good knowledge of Agile software development methodology.
   - Excellent interpersonal and communication (verbal and written) skills.
   - Proven experience in managing and working with teams based in multiple
   geographies.
   - Bachelor’s Degree or higher in Computer Science or a related field.

Thanks

Ramesh

-- 
You received this message because you are subscribed to "rtc-linux".
Membership options at http://groups.google.com/group/rtc-linux .
Please read http://groups.google.com/group/rtc-linux/web/checklist
before submitting a driver.
