*Hi,*

*Immediate need!*
*Let me know if you have any candidates available for the job position below. Thanks!*

Please send resumes to *moh...@datagrp.com*; Direct: 201-308-8704

*Big Data Solutions Architect*
*Jersey City, NJ*
*6–8 Months*
*Implementation partner: Spectra Group*
*Rate: D.O.E.*

*Senior Application Developer – Architecture*

· Participate in the design and documentation of strategic software development initiatives for the enterprise architecture group.
· Develop highly scalable applications using Spark, Kafka and Hive.
· Have good knowledge of microservices and eventing architecture and use it to develop state-of-the-art components that scale, using Cloud, Big Data and Docker as building blocks.
· Familiarity with Parquet, Avro and Hive context is important to influence the design of new components.
· Interface with internal teams who are both technical users and end users of the relevant systems.
· Interface with RTB/production support teams to triage and fix relevant production issues, including communicating with upstream and downstream users.

*Basic Qualifications* (Legislative Requirement)

· Advanced degree in computer science or a related field.
· Expert-level knowledge of enterprise design patterns, best practices and various SDLCs (Waterfall, Scrum, Kanban).
· 8+ years of professional programming experience.
· 3+ years of experience with various Hadoop ecosystem components; 2+ years of hands-on experience in Spark (Spark Core/SQL).
· Hands-on experience in real-time data processing using Spark, Kafka and Hive is a must-have for this role.
· Knowledge of Python, Java, JavaScript and Scala is good to have.
· Experience with development tools and frameworks such as Spring, Git, Jenkins, Maven and Nexus.
· Strong desire and proven ability to tackle and troubleshoot challenging technical problems, and the ability to do so with little or no direct daily supervision.
*Other Required Qualifications:*

· Demonstrated track record of bringing software from inception to live production environments.
· Experience with a wide array of standard technologies, including databases, messaging and big data solutions.
· Working knowledge of Market Risk data entities in the context of large regulatory projects.
· Good understanding of derivatives, with cross-asset-class experience.
· Experience with real-time data pipelines and architecture.
· Experience working with NoSQL databases or in-memory databases.
· Experience with Cloud and Docker.

Job Description:

PURPOSE

- Manage a global team of software developers to design, develop and deliver software for the needs of the Enterprise Architecture Department.
- Deliver strategic core components for the delivery of market data and risk data to meet regulatory reports and other strategic initiatives.

*PRINCIPAL RESPONSIBILITIES*

- Participate in the design and documentation of strategic software development initiatives for the enterprise architecture group.
- Manage the daily tasks of developers in strategic software initiatives, with a focus on GUI development (Python Django).
- Have good knowledge of microservices and eventing architecture and use it to develop state-of-the-art components that scale, using Cloud, Big Data and Docker as building blocks.
- Familiarity with Parquet, Avro and Hive context is important to influence the design of new components.
- Track and report status, and make long-range plans for development work and software needs.
- Oversee and participate in the software release procedures.
- Interface with internal teams who are both technical users and end users of the relevant systems.
- Interface with RTB/production support teams to triage and fix relevant production issues, including communicating with upstream and downstream users.
- Ensure that employees understand the RBC vision, and support and reinforce targeted behaviors that contribute to RBC goals.
- Provide focus and clarity in establishing individual goals, driving performance management, supporting career development and rewarding strong performance.
- Leverage the value in unit, department and enterprise-wide teams to develop better solutions and achieve a cross-enterprise mindset.
- Accept and successfully execute change while supporting employees through the process and keeping them focused on business priorities.
- Understand and exemplify our people-driven, client-focused leadership model: driving impact, unlocking the potential of our people, adapting quickly, always learning and speaking up for the good of RBC.
- Contribute to an exceptional employee experience through strong people-leadership practices that amplify our Collective Ambition and exemplify the Leadership Model to inspire, support and empower employees to achieve their potential.
- Growth mindset and a focus on always learning; promotes a values-based and inclusive workplace; embraces diverse styles and perspectives; actively mitigates bias and improves diversity representation.
- Provide data and interfaces to data for FO analytics systems and users.

*JOB SPECIFICATIONS AND QUALIFICATIONS*

Basic Qualifications

- Advanced degree in computer science or a related field.
- Expert-level knowledge of enterprise design patterns, best practices and various SDLCs (Waterfall, Scrum, Kanban).
- 8+ years of professional programming experience.
- 3+ years of experience with various Hadoop ecosystem components; 2+ years of hands-on experience in Spark (Spark Core/SQL).
- Hands-on experience in real-time data processing using Spark, Kafka and Hive is a must-have for this role.
- Knowledge of Python, Java, JavaScript and Scala is good to have.
- Experience with development tools and frameworks such as Spring, Git, Jenkins, Maven and Nexus.
- Strong desire and proven ability to tackle and troubleshoot challenging technical problems, and the ability to do so with little or no direct daily supervision.

*Other Required Qualifications:*

- Demonstrated track record of bringing software from inception to live production environments.
- Experience leading teams of software developers using Agile methodologies.
- Experience with a wide array of standard technologies, including databases, messaging and big data solutions.
- Working knowledge of Market Risk data entities in the context of large regulatory projects.
- Good understanding of derivatives, with cross-asset-class experience.
- Experience with real-time data pipelines and architecture.
- Experience working with NoSQL databases or in-memory databases.
- Experience with Cloud and Docker.
- In-memory aggregation technologies such as DRUID, ActivePivot, etc.
*Suggested Qualifications:*

- Knowledge of a variety of high-level programming languages and technologies, including Python, Java, JavaScript, HTML & CSS, Scala, Spark, Hive and HDFS.
- In-memory aggregation technologies such as DRUID, ActivePivot, etc.