Hello,

Hope you are doing well.

We have immediate openings for the positions below. Kindly let me know
your interest and send your updated resume to *bhara...@accurogroup.com*.



*Through TCS*

*Role 1*

*Bloomington, IL*

*Rate-$60/hr c2c*

*Job Title*

*Oracle Insurance Policy Administration (OIPA) Developer*

*Relevant Experience*

*(in Yrs)*

8+ Years

*Technical/Functional Skills*

   - A strong understanding of SQL and general database concepts, in terms
   of query building and optimization
   - Good understanding of various life insurance and/or annuity products
   - Configure business processes based on defined business requirements
   - Strong knowledge of Agile methodology
   - Configure product features (such as calculations) based on product
   specifications
   - Load product rates
   - Define product parameters
   - Configure access and security roles
   - Use OIPA built-in functionality where possible to meet business
   requirements
   - Work with the technical team to ensure integration points are
   supported through configuration
   - Work with the Quality Assurance team to troubleshoot and resolve issues
   - Configuration development in Rule Studio for the OIPA system
   - Understanding of the MySupport process with Oracle for reporting
   defects
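As a minimal illustration of the query-building and optimization skill listed above, the sketch below uses Python's built-in sqlite3 with a hypothetical `policy` table (this is a generic example, not OIPA's actual schema) to show how an index changes the query plan from a full scan to an index search:

```python
import sqlite3

# Hypothetical policy table for illustration only; OIPA's real schema
# is not described in this posting.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE policy (policy_id INTEGER PRIMARY KEY, status TEXT, premium REAL)"
)
conn.executemany(
    "INSERT INTO policy (status, premium) VALUES (?, ?)",
    [("ACTIVE", 100.0), ("LAPSED", 50.0), ("ACTIVE", 250.0)],
)

# Without an index, filtering on status scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM policy WHERE status = 'ACTIVE'"
).fetchall()

# Adding an index lets the optimizer seek instead of scan.
conn.execute("CREATE INDEX idx_policy_status ON policy (status)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM policy WHERE status = 'ACTIVE'"
).fetchall()

print(plan_before[0][-1])  # e.g. a SCAN over the policy table
print(plan_after[0][-1])   # e.g. a SEARCH using idx_policy_status
```

The same habit (inspect the plan, then add or adjust indexes) carries over to any SQL engine, though each engine's plan syntax differs.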

*Experience Required*

   - Actuarial Science degree or background.
   - Annuity and/or life insurance product background.
   - Experience with OIPA configuration.
   - Experience with Rule Studio.
   - Experience with Rules Palette.

*Roles & Responsibilities*

We're looking for an OIPA developer who will work with Agile teams on
projects alongside professional colleagues.

   - Must understand requirements from business partners and translate
   them for the team.
   - Comfortable leading teams through change when adopting new
   technologies or development practices.
   - Strong soft skills to help drive organizational change.
   - Expected to demo solutions to the team to aid their understanding of
   future technical challenges.

*Generic Managerial Skills*

   - Excellent written/verbal communication skills
   - Excellent team player
   - Consensus building skills
   - Independently manage Stakeholder Meetings
   - Detail-oriented
   - Excellent visual, written and verbal communication skills, along with
   presentation and negotiation skills
   - The ability to effectively take direction and work both
   collaboratively and autonomously
   - Strong problem-solving skills
   - Strong desire to provide new, cost-effective solutions that deliver
   faster time to market.

*Education*

Bachelor’s degree in Actuarial Science

*Duration of assignment* (in Months)

12 Months

*Work Location*

Bloomington, IL

*Key words to search in resume*

OIPA Developer

*Prescreening Questionnaire*

Hands-on experience in OIPA







*Role 2*

*Azure Databricks*

*Location: Auburn Hills, MI (Remote to Start)*

*Rate: $60/hr max*

*Job Title*

Azure Databricks

*Technical/Functional Skills*

   - Develop a deep understanding of the data sources, implement data
   standards, and maintain data quality and master data management.
   - Expert in building Databricks notebooks that extract data from source
   systems such as DB2 and Teradata and perform data cleansing, data
   wrangling, and ETL processing before loading to Azure SQL DB.
   - Expert in building ephemeral notebooks in Databricks (wrapper, driver,
   and config) for processing data and back-feeding it to DB2 using a
   multiprocessing thread pool.
   - Expert in developing JSON scripts for deploying data-processing
   pipelines in Azure Data Factory (ADF).
   - Expert in using Databricks with Azure Data Factory (ADF) to compute
   large volumes of data.
   - Performed ETL operations in Azure Databricks by connecting to
   different relational database source systems using JDBC connectors.
   - Developed Python scripts to do file validations in Databricks and
   automated the process using ADF.
   - Analyzed SQL scripts and redesigned them in PySpark SQL for faster
   performance.
   - Worked on reading and writing multiple data formats such as JSON,
   Parquet, and Delta from various sources using PySpark.
   - Developed an automated process in the Azure cloud that ingests data
   daily from a web service and loads it into Azure SQL DB.
   - Expert in optimizing PySpark jobs to run on different clusters for
   faster data processing.
   - Developed Spark applications in Python (PySpark) on a distributed
   environment to load large numbers of CSV files with different schemas
   into PySpark DataFrames and process them for reloading into Azure SQL DB
   tables.
   - Analyzed data where it lives by mounting Azure Data Lake and Blob
   Storage to Databricks.
   - Used Logic Apps to take decisional actions based on the workflow, and
   developed custom alerts using Azure Data Factory, SQL DB, and Logic Apps.
   - Developed Databricks ETL pipelines using notebooks, Spark DataFrames,
   Spark SQL, and Python scripting.
   - Developed Spark applications using PySpark and Spark SQL for data
   extraction, transformation, and aggregation from multiple file formats,
   analyzing and transforming the data to uncover insights into customer
   usage patterns.
   - Good knowledge of and exposure to the Spark architecture, including
   Spark Core, Spark SQL, DataFrames, Spark Streaming, driver nodes, worker
   nodes, stages, executors, and tasks.
   - Involved in performance tuning of Spark applications: setting the
   right batch interval, the correct level of parallelism, and memory
   tuning.
   - Expert in understanding the current production state of an application
   and determining the impact of a new implementation on existing business
   processes.
   - Involved in migration of data from on-prem servers to cloud databases
   (Azure Synapse Analytics (DW) and Azure SQL DB).
   - Good hands-on experience setting up Azure infrastructure such as
   storage accounts, integration runtimes, service principal IDs, and app
   registrations to enable scalable, optimized support for business users'
   analytical requirements in Azure.
   - Expert in ingesting streaming data with Databricks Delta tables and
   Delta Lake to enable ACID transaction logging.
   - Expert in building a Delta Lake on top of a data lake and performing
   transformations in Delta Lake.
   - Expert in implementing a distributed stream-processing platform with
   low latency and seamless integration with data and analytics services
   inside and outside Azure to build a complete big data pipeline.
   - Expert in performance tuning of Delta Lake implementations (optimize,
   rollback, cloning, time travel).
   - Developed complex SQL queries using stored procedures, common table
   expressions (CTEs), and temporary tables to support Power BI reports.
   - Development-level experience in Microsoft Azure providing data
   movement and scheduling functionality to cloud-based technologies such
   as Azure Blob Storage and Azure SQL Database.
   - Independently manage development of ETL processes, from development to
   delivery.
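One of the bullets above mentions Python scripts for file validation. As a minimal sketch of that kind of check (pure standard-library Python, runnable outside Databricks; the schema and feed contents are hypothetical examples, not from the posting):

```python
import csv
import io

# Hypothetical expected schema -- the posting does not specify one.
EXPECTED_HEADER = ["customer_id", "event_date", "amount"]

def validate_csv(text: str, expected_header=EXPECTED_HEADER) -> list:
    """Return a list of validation problems found in a CSV feed (empty = OK)."""
    problems = []
    rows = list(csv.reader(io.StringIO(text)))
    if not rows:
        return ["file is empty"]
    if rows[0] != expected_header:
        problems.append(f"unexpected header: {rows[0]}")
    for lineno, row in enumerate(rows[1:], start=2):
        if len(row) != len(expected_header):
            problems.append(
                f"line {lineno}: expected {len(expected_header)} fields, got {len(row)}"
            )
    return problems

good = "customer_id,event_date,amount\n1,2024-01-01,9.99\n"
bad = "customer_id,event_date,amount\n1,2024-01-01\n"
print(validate_csv(good))  # []
print(validate_csv(bad))   # ['line 2: expected 3 fields, got 2']
```

In a Databricks job the same logic would typically read the file from mounted storage and be triggered on a schedule by ADF, with failures surfaced as alerts.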







*Role 3*

*MongoDB*

*Work Location: Nashville, TN (Remote to Start)*

*Rate: $55/hr max*

*Job Title*

MongoDB

*Relevant Experience*

*(in Yrs)*

10+ Years

*Technical/Functional Skills*

Bridgestone is looking for a skilled MongoDB Technical Lead.

   - Maintain and configure MongoDB instances
   - Keep clear documentation of the database setup and architecture
   - Write procedures for backup and disaster recovery
   - Ensure that the databases achieve maximum performance and availability
   - Design indexing strategies
   - Configure, monitor, and deploy replica sets
   - Upgrade databases through patches
   - Create roles and users and set their permissions
   - Experience in optimizing insertions of large amounts of data
   - Experience with Big Data solutions like Hadoop
   - Experience designing systems that deal with large data sets and a huge
   volume of transactions
   - Demonstrate analytical, problem-solving, presentation, and
   interpersonal skills to handle various critical situations
   - Good grasp of MongoDB's aggregation framework
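The aggregation framework mentioned above composes queries as a list of pipeline stages. A minimal sketch of that shape, expressed as the stage documents that pymongo's `collection.aggregate()` accepts (collection and field names here are hypothetical; no live server is needed to see the structure):

```python
# A minimal MongoDB aggregation pipeline sketch. Field and collection
# names ("status", "region", "amount", "orders") are illustrative only.
pipeline = [
    {"$match": {"status": "ACTIVE"}},        # filter documents first
    {"$group": {"_id": "$region",            # then group by region
                "total": {"$sum": "$amount"},
                "count": {"$sum": 1}}},
    {"$sort": {"total": -1}},                # largest totals first
]

# Against a live server this would run as:
#   from pymongo import MongoClient
#   results = MongoClient().mydb.orders.aggregate(pipeline)
print([next(iter(stage)) for stage in pipeline])  # ['$match', '$group', '$sort']
```

Putting `$match` before `$group` lets MongoDB use indexes and reduce the number of documents flowing into later stages, which matters for the large-data-set requirements above.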

*Experience Required*

10+ Years

*Roles & Responsibilities*

Technical Lead

*Generic Managerial Skills*

MongoDB

*Education*

Computer Science graduate

*Work Location* (State, City and Zip)

Nashville, TN



*Key words to search in resume*

MongoDB







*Bharat Chhibber | Sr. Technical Recruiter*

*Direct: 919 626 9615 | Email: bhara...@accurogroup.com*

-- 
You received this message because you are subscribed to the Google Groups 
"Android Discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to android-discuss+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/android-discuss/CAEmgVe1C-gbOcBsp%3DyniTyrp5RhFoKHuSMtPmJXtMTHUEu7o4A%40mail.gmail.com.
