Hi,
It's a pleasure to reach out to you. I came across your profile on a job portal and wanted to touch base with you regarding one of our Global Implementation partner/client opportunities. Please go through the requirement below and let me know if you are comfortable with the position. Please send me your updated resume along with your best hourly rate, visa status, and availability. An early response is greatly appreciated.

*Role: AWS, S3 Developer with SFDC Wave Analytics*
*Location: Atlanta, GA*
*Duration: Long Term*
*Must: Candidate should have both AWS, S3 development along with SFDC Wave Analytics (MUST)*

*Description*
As a Data & Analytics Engineer supporting Cox Automotive Enterprise Platforms' Lead to Cash (L2C) transformation, and as an Agile team member, you will be responsible for the delivery of strategic analytics data solutions. This role, in partnership with counterpart technology teams, is accountable for the design, development, quality, support, and adoption of production-grade data and analytics solutions. A successful Engineer is one who thrives as a collaborative member of a small team within a "start-up like" environment: she/he may wear many hats, be asked to solve problems beyond her/his current technical knowledge, and is resourceful in ensuring delivery commitments are met.

*Technology Stack: MuleSoft (data movement), AWS (data processing and data repository), and SFDC Wave Analytics (visualization and presentation).*
*Services leveraged within AWS are S3, EMR (Spark, Scala), EC2 (Bash scripting), Data Pipeline, and Redshift.*

*Responsibilities*
· In partnership with the Product Owner and Agile team members, deliver analytics solutions, including collecting data from providers, building transformations and integrations, persisting within repositories, and distributing to consuming systems
· Working primarily within AWS, deliver event-driven data processing pipelines, and ensure data sets are captured, designed, and housed effectively (consistently, optimized for cost, and easy to support and maintain)
· Transition MVP solutions into operationally hardened systems, including introducing reusable objects and patterns to drive automation, maintainability, and supportability
· Participate in backlog refinement and request decomposition, including data discovery and data analysis
· Proactively identify, communicate, and resolve project issues and risks that interfere with project execution
· Self-directed problem solving: research, self-learn, and collaborate with peers to drive technical solutions
· Rapid response and cross-functional work to resolve technical, procedural, and operational issues

*Qualifications*
· A minimum of 5 years of experience delivering analytics, reporting, or business intelligence solutions
· A minimum of 3 years of experience developing in big data technologies (Hadoop, NoSQL, AWS)
· Proficient in SQL and at least one of these programming languages: Java, Scala, Python
· Experience designing event-driven data processing pipelines
· At ease developing within both databases and file systems via CLI
· Strong, hands-on technical skills and self-directed problem solving
· MUST: Experience with MuleSoft, SFDC Sales Cloud CRM, SFDC Wave Analytics
· Desired: Experience with data modeling (normalization, slowly changing dimensions, star schema, data vault)
· Desired: Experience with MPP databases (Teradata, Exadata, Netezza, Redshift)
· Desired: Experience working on Agile teams
· Desired: Experience with Lean software development
· Preferred: Experience developing in Spark (Spark Streaming, DataFrames, Datasets)
· MUST: Experience developing within AWS, especially S3, EMR, Data Pipeline, and Redshift

Regards,
Raju Mandal
Phone: 408-338-6070
*[email protected]*
*WinWire Technologies | www.winwire.com* <http://www.winwire.com/>
