*Hi,* *Good Day!*
*1)* *Role: Business Intelligence Consultant (Data Works)*
*Location: Remote*
*Duration: Long Term*
*Job Description:*
· 10+ years in the DW/BI space, both designing the model and building analytics.
· Has executed at least two end-to-end implementations.
· At least 2 years working with Azure SQL and ADF, and 3+ years building Tableau models/reports/dashboards.
· Experience developing and optimizing stored procedures and functions using T-SQL for data migration and transformation.
· Mastery of the Azure SQL stack architecture (SQL Server, SQL Data Warehouse, and Azure Data Factory).
· Technical knowledge of Python, Pentaho, or shell scripting is required.
· Coordinate with business users and prepare documents to plan and control various volumes of data.
· Maintain data warehouse systems and provide production support.
· Evaluate all data, perform various tests on the data warehouse, and ensure effective implementation.
· Administer all data and ensure compliance with standards and policies.
· Maintain high quality across all deliverables and provide effective guidance on database architecture.
· Self-motivated and passionate about technology.

*2)* *Position: Drupal Software Engineer*
*Location: Remote (candidate needs to work in the CST time zone)*
*Duration: Long Term*
*Responsibilities:*
· Help lead the design, implementation, and support of the new digital platform.
· Collaborate with engineering, design, and product teams to ensure the goals of the product are aligned with what is delivered.
· Manage the technology delivery lifecycle (Development, Test, Deploy, Support).
· Perpetuate a data-driven culture with high standards of operational excellence and continuous improvement.
*Qualifications:*
· Experience with modern JavaScript frameworks, HTML, and CSS.
· Extensive experience building complex applications using object-oriented JavaScript/TypeScript, leveraging the best of modern web frameworks (ESNext, Vue, React, Angular, etc.).
· Experience building front-end JavaScript applications using RESTful interfaces.
· Experience writing JavaScript unit tests and using standard web testing frameworks.
· Some DevOps skills; able to get hands dirty with Linux, AWS, Docker, CI pipelines, etc.
· Good understanding of JavaScript command-line tools: Grunt, Gulp, NPM, etc.
*Preferred Qualifications:*
· Familiarity with JS native technologies (React Native, NativeScript, Ionic, etc.) for iOS/Android.
· UX/UI design experience is a plus.
*Technologies we use:*
· PHP / Java / Python / HTML / CSS / JavaScript
· Modern JavaScript frameworks such as React, Angular, or Vue
· Software: Magento / Drupal / Pimcore
· Cloud platforms: AWS / Azure

*3)* *Position: SDET Engineer*
*Location: Remote*
*Duration: 6+ Months*
*Position Insight:*
· Looking for a Senior SDET with deep hands-on experience in Ruby/RSpec/Cucumber and experience working with a large, legacy Ruby test suite.
*Roles and Responsibilities:*
As a Senior Software Engineer in Test, you will be embedded into a software development team as the expert in both automated and exploratory testing. You will take a lead role in planning, designing, building, and executing tests for our loan application and payments user interfaces, APIs, and related components. Additionally, you will coach and mentor the development team to refine best practices and continuously improve testing skill sets. Specifically, you will:
· Pair with developers to build and expand automated testing tools and frameworks with a focus on UI testing.
· Assess the risks of new solutions and draft test plans to reduce or mitigate those risks.
· Mentor and train engineers in automation and testing practices.
· Provide expertise to the development team on testing standards and practices for both automated and exploratory testing.
· Implement testing tools and practices throughout the entire SDLC, including observability and "tests in production."
· Collaborate with the product team to design solutions that meet deadlines.
· Guide the improvement of quality processes as an expert for the development team and the technology organization as a whole.
*What you will bring to the team:*
· 7+ years of experience testing UX and API components.
· Experience designing and developing automated UI tests.
· Experience with one or more test automation frameworks such as Cypress, TestCafe, Cucumber, RSpec, JUnit, TestNG, or similar.
· Experience with Ruby and JavaScript, including object-oriented design principles.
· History of collaborating with development teams building and maintaining automated testing frameworks.
· Familiarity with all aspects of testing, including performance, security, and privacy.
· Understanding of the different approaches needed to test multiple architectures, such as microservices and event-driven messaging.

*4)* *Role: Dell Boomi Developer*
*Location: Remote*
*Duration: Long Term*
*Job Description:*
· 5 years of development experience: Boomi development, with prior experience in TIBCO BW, MuleSoft, or any EAI/ESB development.
· Complex integration and web services experience (both REST and SOAP).
· Environment setup experience.
· Dell Boomi architecture experience.

*5)* *Role: Big Data Engineer*
*Location: Remote*
*Duration: Long Term*
*Mandatory Skills:* AWS (S3, SNS, Lambda), Spark, Spark Streaming (must), Python, Redshift/Snowflake
*Job Description:*
- Build cool things – Build software across our entire cutting-edge data platform, including data processing, storage, and serving large-scale web APIs, with awesome cutting-edge technologies operating in real time with high availability.
- Harness curiosity – Change how we think, act, and utilize our data by performing exploratory and quantitative analytics, data mining, and discovery.
- Innovate and inspire – Think of new ways to help make our data platform more scalable, resilient, and reliable, and then work across our team to put your ideas into action.
- Think at scale – Lead the transformation of a petabyte-scale, batch-based processing platform to a near real-time streaming platform using technologies such as Apache Kafka, Cassandra, Spark, and other open source frameworks.
- Have pride – Ensure performance isn't our weakness by implementing and refining robust data processing, REST services, RPC (in and out of HTTP), and caching technologies.
- Architect and build the data ecosystem to enable data activation and segmentation capabilities across growth and retention marketing.
- Collaborate with lifecycle and product marketing teams to drive subscriber engagement by applying data-forward solutions to messaging.
- Build integrations with platforms to optimize marketing spend through data and analytics.
- Partner with marketing stakeholders to gather data requirements and design solutions that perform at scale.
- Develop batch and real-time data pipelines, and integrate the martech stack with other engineering services such as personalization and experimentation (a minimal pipeline sketch follows this posting).
- Create data catalogs and validations to ensure quality and correctness of key operational datasets and metrics.
- Coach data engineers on best practices and technical concepts of building large-scale data platforms.
- Work in an Agile environment that focuses on collaboration and fosters a culture of innovation and excellence.
- Grow with us – Help us stay ahead of the curve by working closely with data architects, stream processing specialists, API developers, our DevOps team, and analysts to design systems which can scale elastically in ways which make other groups jealous.
- Lead and coach – Mentor other software engineers by developing reusable frameworks. Review design and code produced by other engineers.
- Build and support – Embrace the DevOps mentality to build, deploy, and support applications in the cloud with minimal help from other teams.
- Not your first rodeo – Have 5+ years of experience developing with a mix of languages (Scala, Python, etc.) and open source frameworks to implement data ingest, processing, and serving technologies on a near real-time basis.
- Data and API ninja – You are also very handy with big data frameworks such as Hadoop and Apache Spark, Spark Streaming, NoSQL systems such as Cassandra or DynamoDB, and streaming technologies such as Apache Spark; you understand reactive programming and dependency injection frameworks such as Spring for developing REST services.
- Have a technology toolbox – Hands-on experience with newer technologies relevant to the data space such as Spark, Kafka, and Apache Druid (or other OLAP databases).
- Experience engineering big-data solutions using technologies like Databricks, EMR, S3, and Spark.
- Demonstrated understanding of data engineering tools and practices, including tools and platforms such as Airflow, Databricks, Snowflake, and Jenkins.
- Experience deploying and running AWS-based data solutions, and familiarity with tools such as CloudFormation, Kinesis, DynamoDB, Athena, and Redshift.
- Cloud first – Plenty of experience developing and deploying in a cloud-native environment, preferably AWS.
- Embrace ML – Work with data scientists to operationalize machine learning models and build apps that make use of the power of machine learning.
- Problem solver – Enjoy new and meaningful technology or business challenges which require you to think and respond quickly.
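For the Spark Streaming / Python requirement above, here is a minimal sketch of the kind of near real-time pipeline described: it consumes JSON events from Kafka and lands them on S3 as Parquet. The broker address, topic name, event schema, and bucket paths are placeholder assumptions, and the job assumes the spark-sql-kafka connector is on the Spark classpath.

```python
# Minimal PySpark Structured Streaming sketch: Kafka -> parse JSON -> Parquet on S3.
# All names (broker, topic, schema fields, bucket) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-stream-demo").getOrCreate()

# Assumed event schema; a real pipeline would derive this from a schema registry.
event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers the payload as bytes; cast to string and parse the JSON body.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Write micro-batches to S3 as Parquet; the checkpoint keeps the job restartable.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/events/")  # placeholder output path
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .outputMode("append")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```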
Best Regards,
Ramesh
Lead Recruiter
E-Mail: [email protected]
www.qcentrio.com
# 405 State Hwy 121, Suite A250, Lewisville, Texas 75067
