Hello, I hope you are doing well.
We have immediate openings for the positions below. Kindly let me know your interest and send your updated resume to *bhara...@accurogroup.com <bhara...@accurogroup.com>*.

*Data Modeler (2) / Data Framework Engineer (4) / Data Platform Engineer (1)*
*Location: Remote / CA*
*Contract*

*Data Framework Engineer*

In this role, you will be responsible for expanding and evolving the data pipeline framework of a cloud-based data platform. The ideal candidate will have experience moving and processing large volumes of data securely across on-premises and cloud environments. The framework engineer will support software developers, data architects, data analysts, and business users in getting the most value out of the platform with minimal friction. The candidate must be self-directed, with the ability to own and execute platform improvement activities while collaborating with other team members and stakeholders.

- 8-10 years of Python development experience, including setting up development environments, using a code repository such as Bitbucket, requesting and performing pull requests, and configuring Python libraries (required)
- 8-10 years of experience building and maintaining data platforms, ETL processes, and connected sources and targets (required)
- 8-10 years of experience architecting, designing, and operationalizing data lakes, data warehouses, and data marts, including data layering, data virtualization/physicalization, data normalization/denormalization, data storage and movement patterns, data supply chains, and data catalogs
- 5-7 years of hands-on experience deploying and running data pipelines on Amazon Web Services (AWS) (required)
- 5-7 years of experience with reference data management, metadata management, party data management, and source and target management
- 5-7 years of experience generating extracts from the data warehouse for downstream systems, putting data securely in the hands of the business, and configuring and optimizing fine-grained entitlements (role-based and attribute-based security)
- 5-7 years of hands-on experience with Hive, AWS S3, AWS Step Functions, and CloudWatch
- 3-5 years of performance measurement and tuning experience (preferred)
- Experience in the financial services or banking industry (required)
- Experience implementing enterprise systems with security best practices and site reliability engineering principles
- Proficiency in Structured Query Language (SQL)
- Deep understanding of the design and development considerations around data partitioning, job scheduling, data versioning, data import/export, archival, and schema management
- Deep understanding of data partitioning and its impact on query performance
- Strong understanding of data read and write performance and the impact of data storage and movement patterns on performance
- Strong foundation in the following AWS cloud services: S3, CloudWatch, EMR (Hive), Glue and the Glue Data Catalog, Redshift Spectrum, EC2, Step Functions
- Experience creating data warehouses that process large volumes of data within a specific time window
- Experience merging, separating, and sunsetting data warehouses
- Experience developing and automating data pipelines and deployments in ETL tools, SQL, Python, Spark, or a related framework
- Experience supporting non-production data environments for testing of various functional and non-functional usages and requirements
- Systems design thinking and the ability to plan and author technical documents
- Problem-solving and prioritization skills
- Ability to articulate new ideas and explain them at all levels of the organization
Responsibilities:

- Study the current-state implementation of the data lake, data warehouse, and data marts; recommend a future state and a migration path that is least disruptive to the business and current production workloads
- Implement best practices and design patterns for the data lake, enterprise data warehouse, and domain-specific data marts
- Enable data self-service through a published data catalog
- Implement fine-grained entitlements and operational reporting on those entitlements to ensure data is accessed only by the right person, in the right role, on a need-to-know basis
- Help implement encryption, obfuscation, and other security best practices to meet all organizational data security policies and guidelines
- Design, build, and automate data pipelines, testing systems, event monitoring, and notifications

*Data Platform Engineer*

In this role, you will collaborate with senior technical team members and developers to expand and evolve the cloud-based enterprise data platform. You will act as a technical resource for the AWS cloud data lake and enterprise data platform, and you will design, configure, and help operate the infrastructure. The ideal candidate will have experience with platforms that ingest data from multiple sources, process and standardize the data, and make it securely available to business users in the cloud. The candidate will triage and troubleshoot issues related to AWS services and help optimize the configuration and use of those services. The candidate must be self-directed, with the ability to own and execute platform improvement activities while collaborating with other team members and stakeholders.
- 10-12 years of experience in cloud engineering (required)
- 8-10 years of experience deploying and running data pipelines on Amazon Web Services (AWS) (required)
- 8-10 years of experience in a hands-on mix of technical engineering, architecture, and/or development roles
- Experience in the financial services or banking industry (preferred)
- 6-8 years of experience moving code and data through non-production and production environments (required)
- 6-8 years of experience working in a large IT environment in support of IT services and business processes, with a proven track record of driving large-scale changes
- 4-6 years of experience installing, patching, updating, upgrading, and sunsetting libraries, software, and hardware (required)
- 4-6 years of experience processing data in Hadoop clusters (required)
- Experience implementing enterprise systems with security best practices and site reliability engineering principles
- Expert knowledge of an AWS cloud data lake/data platform and its associated frameworks, adapters/connectors, and orchestration components
- Expert knowledge of the following AWS services: S3, CloudWatch, EMR (Hive), Glue and the Glue Data Catalog, Redshift Spectrum, EC2, Step Functions, and AWS CloudFormation
- Experience implementing cloud security solutions, including role-based access, access management, identity and directory infrastructure, and AWS Lake Formation
- Expert knowledge of multiple IT control and project management practices, with experience working across large environments
- Advanced working knowledge of public cloud-based solutions and application programming interfaces (APIs)
- Python, Big Data, Hadoop, tuning and optimization, operations, event management, performance
- AWS certification
- Demonstrated leadership capabilities (required)
- Expert working experience with highly segmented networks in public and private clouds
- Deep understanding of infrastructure automation, instrumentation, and full-stack technologies
- SQL, IDEs, Unix, JIRA, Confluence, AWS and AWS services, performance management and monitoring tools
- Experience improving an existing data platform without affecting current production workloads

*Bharat Chhibber | Sr. Technical Recruiter*
*Direct: 919 626 9615 | Email: bhara...@accurogroup.com <bhara...@accurogroup.com>*

--
You received this message because you are subscribed to the Google Groups "Android Developers" group. To view this discussion on the web visit https://groups.google.com/d/msgid/android-developers/CAEmgVe3piMsKeCQn-h4n%2Bo1c8eaD2kv%2BhSj%3DtKQJMrOOm3JPWQ%40mail.gmail.com.