>
> Hello Associates,
>
> Please find the job descriptions below and kindly share your profiles to
> *[email protected]*
>
> *Position: PL/SQL Developer with Data Modeling*
>
>
>
> *Location: Boston, MA.*
>
> *Roles/Responsibilities:*
>
>    - Work on complex PL/SQL in a development/support environment
>    - Analyze existing PL/SQL code and respond to users' requests for
>    clarification
>    - Work in an agile model for end-to-end development
>    - Respond to high-priority issues and provide immediate resolution
>    - Good experience with PL/SQL and complex SQL
>    - Experience in data modeling
>    - Must have data analysis and data mapping experience
>    - Experience talking to business users, with strong communication and
>    presentation skills
>    - Capable of working with business SMEs to gather functional
>    requirements and convert them into technical reporting requirements
>    - Knowledge of agile artifacts such as task backlogs, burn-down metrics,
>    velocity, and user stories
>    - Good experience in unit testing and defect fixing
>    - Good knowledge of data warehousing (DW) concepts
>
>
>

*Role: Big Data Engineer*

*Location: Boston, MA.*


*Skills/Requirements:*

   - 4 to 7 years of recent experience in data engineering.
   - Bachelor’s Degree or more in Computer Science or a related field.
   - A solid track record of data management showing your flawless
   execution and attention to detail.
   - Strong knowledge of and experience with statistics.
   - Programming experience, ideally in Python, Spark, Kafka or Java, and a
   willingness to learn new programming languages to meet goals and
   objectives.
   - Experience in C, Perl, Javascript or other programming languages is a
   plus.
   - Knowledge of data cleaning, wrangling, visualization and reporting,
   with an understanding of the best, most efficient use of associated tools
   and applications to complete these tasks (a minimal cleaning sketch
   follows this list).
   - Experience in MapReduce is a plus.
   - Deep knowledge of data mining, machine learning, natural language
   processing, or information retrieval.
   - Experience processing large amounts of structured and unstructured
   data, including integrating data from multiple sources.
   - Experience with machine learning toolkits including H2O, SparkML or
   Mahout.
   - A willingness to explore new alternatives or options to solve data
   mining issues, and utilize a combination of industry best practices, data
   innovations and your experience to get the job done.
   - Experience in production support and troubleshooting.
   - You find satisfaction in a job well done and thrive on solving
   head-scratching problems.
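
For illustration only and not part of the requirements above: a minimal
PySpark sketch of the kind of data cleaning and multi-source integration work
described in this list, assuming a local Spark installation and hypothetical
input paths.

    # Hypothetical sketch: clean semi-structured event data, join it with a
    # structured reference table, and produce a simple daily report.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("event-cleaning-sketch").getOrCreate()

    # Hypothetical inputs: JSON event files and a CSV lookup table.
    events = spark.read.json("/data/raw/events/*.json")
    users = spark.read.option("header", True).csv("/data/reference/users.csv")

    cleaned = (
        events
        .dropDuplicates(["event_id"])                     # remove duplicate events
        .filter(F.col("event_ts").isNotNull())            # drop rows missing a timestamp
        .withColumn("event_date", F.to_date("event_ts"))  # normalize timestamps to dates
    )

    # Integrate data from multiple sources and aggregate for reporting.
    report = (
        cleaned.join(users, on="user_id", how="left")
        .groupBy("event_date", "country")
        .agg(F.count("*").alias("events"))
    )

    report.write.mode("overwrite").parquet("/data/curated/daily_events")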



> *Position: Java/Spark Developer with Azure*
>
> *Location: Boston, MA*
>
>
> *Required Skills:*
>
>    - Strong experience with *Java* (5 years minimum), including OOP
>    concepts, collections, concurrency, web services, Spring, and unit testing
>    - Good experience with "big data" technologies such as *Azure and
>    Spark*
>    - Experience with CI tooling and concepts
>    - Experience with build automation tools
>    - Excellent verbal and written communication skills
>
>
> *Role: C++ Developer*
>
> *Location: Boston, MA.*
>
>
>
> *Requirements:*
>
> ·        Bachelor's degree and a minimum of 5 years of experience with the
> following: C, C++, Python, bash scripting, database and network programming
>
> ·        Knowledge of Linux and Unix platforms
>
> ·        Experience with practical cryptography functions (e.g., key
> management)
>
> ·        Software development life cycle, including architecture, design,
> implementation, documentation, and testing
>
> ·        Experience with formal test events, JSON, Protobuf, and SQL-based
> databases; Java, SQL, JavaScript, HTML, CSS, and MATLAB
>
> ·        Network programming
>

>
> *Role: Snowflake Developer*
>
> *Location: Boston, MA.*
>
>
>
> *Basic Qualifications:*
>
>    - Minimum of 2 years developing a fully operational, production-grade,
>    large-scale data solution on Snowflake Data Warehouse.
>    - 3 years of hands-on experience building productionized data ingestion
>    and processing pipelines using Spark and Python (a minimal ingestion
>    sketch follows this list).
>    - 2 years of hands-on experience designing and implementing
>    production-grade data warehousing solutions on large-scale data
>    technologies such as Teradata, Oracle or DB2.
>    - Expertise in and excellent understanding of Snowflake internals and
>    the integration of Snowflake with other data processing and reporting
>    technologies.
>    - Excellent presentation and communication skills, both written and
>    verbal.
>    - Ability to problem-solve and architect solutions in an environment
>    with unclear requirements.
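
For illustration only and not part of the qualifications above: a minimal
Python sketch of a single Snowflake ingestion step, assuming the
snowflake-connector-python package and hypothetical connection, stage, and
table names.

    # Hypothetical sketch: load staged CSV files into a Snowflake table and
    # verify the resulting row count.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",   # placeholder connection details
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )

    try:
        cur = conn.cursor()
        # Assumes a table RAW.EVENTS and a stage @RAW_STAGE already exist.
        cur.execute(
            "COPY INTO EVENTS FROM @RAW_STAGE/events/ "
            "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
        )
        cur.execute("SELECT COUNT(*) FROM EVENTS")
        print("rows in EVENTS:", cur.fetchone()[0])
    finally:
        conn.close()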
>
>
>
> *Role: Azure DevOps Engineer with Kubernetes*
>
> *Location: Boston, MA.*
>
>
>
> *Requirements*
>
> The successful candidate will meet the following qualifications:
>
>    - At least 5 years of demonstrated success in a related IT profession,
>    including substantial DevOps experience and release management practices
>    for enterprise teams, and a passion for automation, Kubernetes, and
>    microservices.
>    - Experience applying *Azure DevOps* and *continuous integration and
>    continuous deployment (CI/CD)* concepts, and *building CI/CD pipelines*
>    with source control such as Git, *Azure DevOps*, and Team Foundation
>    Server (TFS), plus Azure Stack and Azure services on-premises.
>    - Automation with tools such as *Ansible,* Chef, Puppet, or PowerShell.
>    - Use of Git and GitOps for Kubernetes cluster management and
>    application delivery: Git serves as the single source of truth for
>    declarative infrastructure and applications, GitOps agents alert on any
>    divergence between Git and what is running in a cluster, and Kubernetes
>    reconcilers automatically update or roll back the cluster accordingly.
>    Git is typically the center of the CI/CD delivery pipelines, and
>    developers make pull requests to accelerate and simplify both
>    application deployments and operations tasks on Kubernetes (a minimal
>    reconciliation sketch follows this list).
>    - Infrastructure as code; codification of infrastructure.
>    - Experience with Docker containerization, *Kubernetes,* ServiceMesh,
>    API Gateway and Proxy Server.
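
For illustration only and not part of the requirements above: a minimal
Python sketch of the GitOps reconciliation idea described in the list, with
hypothetical manifest paths and a stubbed in-memory "cluster" object standing
in for a real Kubernetes API client.

    # Hypothetical sketch of a GitOps-style reconciliation loop: the desired
    # state checked into Git wins over whatever is currently running.
    import json
    from pathlib import Path

    class FakeCluster:
        """In-memory stand-in for a Kubernetes client, for illustration only."""
        def __init__(self, images):
            self.images = dict(images)
        def list_apps(self):
            return list(self.images)
        def get_image(self, app):
            return self.images[app]
        def set_image(self, app, image):
            self.images[app] = image

    def desired_state(repo_path):
        """Read declarative app definitions from a local Git working copy."""
        state = {}
        for manifest in Path(repo_path).glob("apps/*.json"):
            app = json.loads(manifest.read_text())
            state[app["name"]] = app["image"]
        return state

    def reconcile(repo_path, cluster):
        desired = desired_state(repo_path)
        live = {app: cluster.get_image(app) for app in cluster.list_apps()}
        for app, image in desired.items():
            if live.get(app) != image:
                # Divergence detected: update (or roll back) the cluster to match Git.
                print(f"reconciling {app}: {live.get(app)} -> {image}")
                cluster.set_image(app, image)

    # Example run against the stub:
    # reconcile("/path/to/gitops-repo", FakeCluster({"web": "web:1.0"}))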
>
>
>
>
>
> *Role: TLM Developer*
>
> *Location: Boston, MA.*
>
>
>
>    - Transaction Lifecycle Management (TLM) Premium application, a
>    SmartStream product hosted and configured internally
>    - Need developers experienced with the latest versions of TLM, known
>    internally as TLP
>
>
*Role: GoLang Developer*
*Location: Boston, MA.*

·        GoLang, Java or any other modern programming language

·        Docker, Kubernetes, Microservices Architecture

·        Payments experience

·        Experience in development in an agile model is a plus.

·        Development experience in web-related technologies such as RESTful
web services designed according to REST principles.

·        Experience with Java 8, Spring Boot, core Spring, Spring MVC, Spring
Integration, Hibernate/JPA, JDBC, XSD, Tomcat, and Oracle/MySQL.

·        Experience with the entire software development lifecycle, including
version control using Git, the build process, CI/CD tools like Jenkins,
Kubernetes containerization, functional testing frameworks, and code release.

>
> Please share your profiles to *[email protected]*
>
>
