Specialist, IT

Req #: JR-070650 | Location: Bengaluru, Karnataka, India | Job Category: Information Technology | Date posted: 05/27/2022

This is where you save and sustain lives

At Baxter, we are deeply connected by our mission. No matter your role at Baxter, your work makes a positive impact on people around the world. You'll feel a sense of purpose throughout the organization, as we know our work improves outcomes for millions of patients.

Baxter's products and therapies are found in almost every hospital worldwide, in clinics and in the home. For over 85 years, we have pioneered significant medical innovations that transform healthcare.

Together, we create a place where we are happy, successful and inspire each other. This is where you can do your best work.

Join us at the intersection of saving and sustaining lives—where your purpose accelerates our mission.

Sr Data Engineer (Specialist)

The Sr Data Engineer is responsible for building application logic for various data integration layers.

The selected candidate will be responsible for deploying a data integration framework on traditional and big data platforms that conforms to global architecture and models.

The main task is implementing the data acquisition/data integration layer and data analysis capabilities for digital health analytics. The candidate will work closely with Business Analysts (BAs), internal customers, dedicated technical developers, data scientists, and architects.

The Data Engineer should have the expertise to support projects using a variety of analytical skills. This position will also support business process and strategy activities within the digital health analytics organization, which includes analyzing large data sets pertaining to different business processes. In this position, there is an opportunity to interact with Legal, Commercial, Finance, and Research & Development as project requirements dictate in order to complete project objectives.

Responsibilities

Responsibilities of this role include but are not limited to:

  • Retrieve, prepare, and transform a variety of data sources
  • Analyze and implement data integration layers to support analysis
  • Perform data processing and data mining using open-source technologies
  • Apply hands-on dimensional modeling experience
  • Hands-on with the AWS ecosystem for data ingestion, storage, and processing services such as Amazon S3, AWS Glue, Amazon EC2, Athena, SNS, and RDS
  • Proficient in writing SQL queries and PL/SQL packages in relational databases such as Oracle, Postgres, and Amazon RDS
  • Hands-on with data manipulation on RDDs and DataFrames in PySpark
  • Hands-on with writing PySpark/Python code in AWS Glue (see the sketch after this list)
  • Hands-on with transformation libraries used in AWS Glue for data manipulation and database interaction, e.g. Boto3, pandas, pg8000, and cx_Oracle
  • Hands-on experience with Redshift and Snowflake
  • Hands-on with working in a Linux environment on Amazon EC2 or an equivalent environment
  • Hands-on with AWS CLI commands for interacting with services such as S3, EC2, AWS Glue, and RDS
  • Determine operational objectives by studying business functions, gathering information, and evaluating output requirements and formats
  • Create and document protocols for quality assurance and reporting through standard operating procedure documents
  • Validate and self-test code
  • Convert business requirements into technical specifications
  • Document functional and design requirements, testing results, change control, or any validation documentation needed for assigned projects
  • Manage one's own activities to achieve defined quality goals in an efficient, accurate, and timely manner
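
To make the PySpark and AWS Glue items above concrete, here is a minimal illustrative sketch of the kind of transformation job this role would own. It is not Baxter's actual pipeline: the bucket names, paths, and column names are hypothetical placeholders, and the logic uses only the standard PySpark DataFrame API so it could run as an AWS Glue Spark job or on Amazon EMR/EC2.

# Minimal sketch of a data integration job (hypothetical dataset and paths).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-data-integration").getOrCreate()

# Read raw CSV files landed in S3 (placeholder path; plain Spark outside
# Glue/EMR would typically use the s3a:// scheme and explicit credentials).
raw = spark.read.csv("s3://example-bucket/raw/events/", header=True, inferSchema=True)

# Basic cleansing and a derived partition column using the DataFrame API.
cleaned = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write the conformed layer back to S3 as date-partitioned Parquet.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/conformed/events/"
)

In AWS Glue specifically, the same transformations are usually wrapped in the standard Glue job boilerplate (GlueContext and Job from the awsglue library), but the DataFrame logic shown here is unchanged.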

Qualifications

  • Bachelor's degree required, preferably in computer science, software engineering, business administration, economics, mathematics, statistics, or a related field.
  • Strong programming skills in data integration and analysis.
  • 6+ years of experience working with large data sets is preferred.
  • Ability to work well in a team-based environment.
  • Ability to work independently and maintain internal client relationships.
  • Strong problem solving and troubleshooting skills with the ability to exercise mature judgment.
  • Strong work ethic with attention to detail and a commitment to client service excellence.
  • Excellent analytical skills, problem solving skills, strong verbal and written communication skills.  
  • Be a self-starter who can prioritize tasks, manage deadlines, navigate and be successful in a fast-paced, dynamic work environment.
  • 2+ years of hands-on dimensional modeling experience
  • 3+ years of python and PySpark experience
  • 3+ years of AWS and unix experience
  • Preferred certifications: AWS Certified Cloud Practitioner, Python/PySpark certifications

Reasonable Accommodations

Baxter is committed to working with and providing reasonable accommodations to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the application or interview process, please send an e-mail to [email protected] and let us know the nature of your request along with your contact information.

Recruitment Fraud Notice

Baxter has discovered incidents of employment scams, where fraudulent parties pose as Baxter employees, recruiters, or other agents, and engage with online job seekers in an attempt to steal personal and/or financial information. To learn how you can protect yourself, review our Recruitment Fraud Notice.

