Senior Software Engineer

The primary responsibilities of this role, Senior Software Engineer, are to: 

 

  • Analyze, design and develop tests and test-automation suites.
  • Design and develop a digital platform for research and development.
  • Apply test-driven software development methodology in an agile environment.
  • Provide ongoing maintenance, support and enhancements in existing systems and platforms.
  • Collaborate cross-functionally with data scientists, business users, project managers and other engineers to achieve elegant solutions.
  • Provide recommendations for continuous improvement.
  • Work alongside other engineers on the team to elevate technology and consistently apply best practices.
  • Recommend upgrades for existing systems and programs.
  • Create various diagrams, flowcharts and models that illustrate the type of code needed for programmers.
  • Identify and assess new technologies prior to implementation.

 

WHO YOU ARE          

Your success will be driven by your demonstration of our LIFE values. More specifically related to this position, Bayer seeks an incumbent who possesses the following:

 

Required Qualifications: 

  • Minimum of a Bachelor's Degree in Computer Science or relevant discipline;
  • Minimum of five years’ experience with C#, NodeJS, ReactJS, Python;
  • Knowledge of professional software engineering and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations (any relevant platform environment, incl. AWS cloud);
  • Experience developing distributed/scalable systems and high-volume transaction applications; unit testing, version control (Git, SVN, etc.), and peer code reviews; code optimization, coding guidelines, and tools for enforcing them;
  • Experience creating unit tests, integration testing, and test automation (a brief illustrative sketch follows this list);
  • Experience serving as technical lead throughout the full software development lifecycle, from conception, architecture definition, detailed design, scoping, planning, implementation, and testing to documentation, delivery, and maintenance is preferred;
  • Demonstrated practical experience setting up and leveraging Amazon Web Services technologies;
  • Knowledge of current methods for industrializing software development (continuous integration/testing/delivery, etc.), software development lifecycles (SDLC), and agile methodologies such as Scrum and test-driven development.
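
To make the testing expectations above concrete, here is a minimal, hypothetical sketch using Python's standard unittest module (Python being one of the languages named in this posting). The normalize_location helper and the test names are illustrative assumptions, not part of any Bayer codebase.

    import unittest


    def normalize_location(raw: str) -> str:
        """Hypothetical helper: collapse whitespace and title-case a location string."""
        return " ".join(raw.strip().split()).title()


    class NormalizeLocationTests(unittest.TestCase):
        def test_collapses_whitespace_and_title_cases(self):
            self.assertEqual(normalize_location("  bangalore   urban "), "Bangalore Urban")

        def test_blank_input_returns_empty_string(self):
            self.assertEqual(normalize_location("   "), "")


    if __name__ == "__main__":
        unittest.main()  # entry point for a local run or a CI test stage

In practice, tests like these would be wired into the build and code-review processes referenced in the qualifications above.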

 

 Preferred Qualifications:          

  • Experience with stream processing: Kafka, Spark Streaming, Akka, Flink, etc. (a brief consumer sketch follows this list);
  • Experience with data modeling for large-scale databases, either relational or NoSQL (graph, key-value, document, etc.);
  • Experience building APIs using GraphQL, Node.js, React, and other technologies.
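
As one possible illustration of the stream-processing experience listed above, the sketch below consumes JSON events from Kafka using the third-party kafka-python package. The topic name, broker address, and consumer group are assumptions for illustration, and a running Kafka broker is required.

    import json

    from kafka import KafkaConsumer  # third-party package: pip install kafka-python

    consumer = KafkaConsumer(
        "job-events",                        # hypothetical topic name
        bootstrap_servers="localhost:9092",  # assumed local broker
        group_id="demo-group",
        auto_offset_reset="earliest",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    # Poll indefinitely and hand each decoded event to a placeholder processing step.
    for message in consumer:
        print(f"partition={message.partition} offset={message.offset} value={message.value}")

A production consumer would typically commit offsets explicitly and forward events to a downstream store rather than printing them.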

Company: Bayer
Job Posted: a year ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 0-2 years
Location: Bangalore Urban, Karnataka, India
Qualification: Bachelor's degree


Related Jobs


Data Steward

Bayer

Bangalore Urban, Karnataka, India

Posted: a year ago

POSITION PURPOSE:

Develop and deploy data-based sustainable solutions while working with R&D scientists and IT & data teams, and answer important questions that drive key decisions for our business.

YOUR TASKS AND RESPONSIBILITIES:

  • Defines data quality rules and implements automated monitoring, reporting, and remediation solutions.
  • Implements and fine-tunes data governance guidelines, policies, processes, and controls.
  • Ensures data consistency across multiple systems and business units.
  • Coordinates design sessions with Stewards, Data Engineers, Engineering teams, Data Scientists, Product Managers, and business and/or IT stakeholders that result in design documentation and business metadata capture.
  • Participates in trainings and discussions to evangelize these frameworks and objectives: governance, data quality, data wrangling, and best practices.
  • Maintains records of adequate data collection, maintenance, and usage.
  • Implements and utilizes data solutions for data analysis and profiling using a variety of tools such as Postman, R or Python, following the team's established processes and methodologies.
  • Collaborates with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts.
  • Utilizes root cause analysis to identify trends and assess the impact of data quality issues.
  • Supports data migration from legacy systems, as well as data inserts and updates not supported by applications.
  • Participates in data scraping, data curation and data compilation efforts.
  • Ensures high quality of the data delivered to end users.
  • Ensures high quality of the in-house data via data stewardship.
  • Ensures adoption of taxonomy and ontology for the compiled data delivered to end users.
  • Has a digital mindset and knowledge of Python/R programming to automate data stewardship workflows.
  • Participates in Open Data efforts, making data FAIR (Findable, Accessible, Interoperable and Reusable) to strengthen effectiveness.

WHO YOU ARE:

  • Master's or Bachelor's Degree in Computer Science, Engineering, Crop Science, Agriculture, or another related field.
  • Solid experience in areas such as: the relevant business domain; querying SQL and/or NoSQL databases; managing data using APIs; semantic intelligence and knowledge graphs; manipulating data using scripting languages and/or data processing software (e.g. Python, R, Pipeline Pilot) and data management/governance/ETL applications (such as Tibco EBX, Talend or Indigo); and profiling data, summarizing, and reporting data quality metrics.
  • Ability to deliver detailed technical documentation.
  • Experience handling sensitive data.
  • Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
  • Knowledge of modern engineering technologies and data principles, for instance Big Data, cloud computing, NoSQL, etc.
  • Understanding of data architecture and modeling.
  • Knowledge of industry data practice/governance models (DAMA, CMMI, DGI, etc.) and data strategy frameworks (Gartner, St Gallen, etc.).
  • Knowledge of data management best practices.
  • Knowledge of the business or data domain within a business unit.


Senior Data Engineer

Walmart

Bangalore Urban, Karnataka, India

Posted: a year ago

What you'll do:

  • Drive design, development, implementation, and documentation, and follow the agile development process.
  • Build, test and deploy cutting-edge solutions at scale, impacting associates of Walmart worldwide.
  • Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
  • Contribute to successful implementation by applying technical skills to design and build enhanced processes and technical solutions in support of strategic initiatives.
  • Work closely with Tech Leads/Architects and cross-functional teams and follow established practices for the delivery of solutions meeting QCD (Quality, Cost & Delivery) within the established architectural guidelines.
  • Learn new and changing technologies and adapt to ensure best software practices and standards.
  • Participate in hiring and support building teams.
  • Interact closely with business owners and technical teams, both within India and across the globe, on requirements.
  • Help and participate with teams that leverage and contribute to open-source technologies to make an impact on a global scale.

What you'll bring:

  • Bachelor's/Master's degree in engineering with 3-6 years of experience in design and development of highly scalable applications and platform development in product-based companies or R&D divisions.
  • Excellent problem solving, critical thinking and communication skills.
  • Ability to work effectively on teams as well as be self-motivated, task oriented and organized.
  • Ability to adapt to change quickly; willingness to learn new and emerging technologies.
  • Strong customer focus and obsession with quality.
  • Strong in algorithms, data structures and distributed systems.
  • Strong programming skills in Java, Scala or Python.
  • Strong in writing modular and testable code and test cases (unit, functional and integration).
  • Strong debugging/profiling skills.
  • Strong knowledge of the big data ecosystem: data warehousing, data lakes, ETL, ELT, Spark/PySpark, Hive, SQL, Presto or a similar OLAP technology, and a scheduler such as Airflow.
  • Strong in database and data modelling: SQL, OLAP, OLTP, star schema, snowflake schema, and fact & dimension tables.
  • Good understanding of the CI/CD process (desirable).
  • Good understanding of Dremio and Druid (good to have).
  • Good understanding of a visualisation tool like Tableau (good to have).
  • Working knowledge of GCP-based tools like BigQuery, Dataproc, GCS (good to have).
  • Good understanding of data quality, data lineage and data governance (good to have).
  • Good exposure to pub-sub systems like Kafka/Kinesis and NoSQL databases like Cassandra or Cosmos DB (good to have).