Data Engineer (Azure)
IBM
Singapore, Central Singapore Community Development Council, Singapore
Introduction
At IBM, work is more than a job – it’s a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you’ve never thought possible. Are you ready to lead in this new era of technology and solve some of the world’s most challenging problems? If so, let’s talk.

Your Role and Responsibilities
As a DTT Engineer/Architect, you will guide the technical evaluation phase as well as the design and development phase in a hands-on environment across Data Platforms, Internet of Things (IoT) and Automation, Analytics including AI and Machine Learning, and Blockchain. You will be a technical advisor internally to the sales and delivery teams, and work with the product (analytics or data) team as an advocate for your customers in the field. You’ll grow as a leader in your field while finding solutions to our customers’ biggest challenges in big data, IoT, automation, data engineering, data science and analytics.

As a Data Engineer or Solution Architect, you will provide services to clients in analytics- or data-related solutioning and in the delivery of complex projects/programs for cloud and non-cloud environments, including complex application and/or system integration projects. You will help our customers achieve tangible data-driven outcomes through the use of Data Engineering frameworks, Data Platforms, Automation and Blockchain, helping data and analytics teams complete projects and integrate our platform into their enterprise ecosystem. You will be responsible for stitching together the architectural landscape, from data acquisition, ingestion and transformation through to loading the curated data into the desired data warehouses as data marts, as required (an illustrative PySpark sketch of this flow appears after the requirements list below). You will also facilitate how the curated data is consumed by downstream applications to meet business requirements, whether as Management Information Systems or Analytics solutions. The Solution Architect will build architectures and coordinate with other architects to provide end-to-end prescriptive guidance across network, storage, operating systems, virtualization, RDBMS and NoSQL databases, and mid-tier technologies including application integration, in-memory caches, and security.

Requirements
• Overall 12+ years of (consulting) experience focused on data and analytics.
• A good understanding of data warehousing, ETL, complex event processing, data engineering, Big Data principles, data visualization, Data Science, Business Intelligence, Analytics products, etc.
• Experience working in a hybrid cloud environment and exposure to Big Data frameworks is a must.
• Proficient understanding of distributed computing principles.
• Deep experience with distributed systems, large-scale non-relational data stores, map-reduce systems, data modelling, database performance, and multi-terabyte data warehouses.
• Knowledge of the Internet of Things, including IoT device knowledge, is a must for this role.
• Knowledge of containerization frameworks such as Kubernetes or Red Hat OpenShift is an added advantage.
• Knowledge of API/microservices development is a good-to-have skill.
• Exposure to managing and implementing integrations between internal and external solutions.
• Demonstrated experience in collaborating with domain architecture leadership.
• Extensive development expertise in Spark and other Big Data processing frameworks (Hadoop, Storm, Kafka, etc.).
• Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala.
• Knowledge of various ETL techniques and frameworks, such as Flume, and stream processing systems like Storm or Spark Streaming (see the streaming sketch below).
• Programming knowledge and skill with SQL, NoSQL, Python and PySpark.
• Working knowledge of other BI/Analytics/Big Data tools (IBM Cognos, QlikView, Hortonworks, Cloudera, Azure Data Factory, Automation Anywhere, Blue Prism) is a plus.
• Experience in creating end-to-end blueprints, estimating effort, and pricing and risk assessment of the solution.
• Excellent communication skills with an ability to lead conversations at the right level.
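
For illustration only, here is a minimal PySpark batch sketch of the acquisition-to-data mart flow described in the role above. The source path, column names, and target table (sales_mart.daily_sales_by_region) are hypothetical placeholders, not part of this posting; in practice such a job would typically be orchestrated by a tool such as Azure Data Factory.

# Minimal sketch: ingest raw extracts, transform, and load a data mart table.
# Paths, columns, and the target table are assumed placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-datamart-load").getOrCreate()

# Acquisition/ingestion: read raw extracts landed in a data lake path.
raw = spark.read.option("header", "true").csv("/mnt/raw/sales/*.csv")

# Transformation: type the columns, drop bad records, derive measures.
curated = (
    raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("total_sales"),
            F.count("*").alias("order_count"))
)

# Load: write the curated result to a data mart table (assumes the
# sales_mart database exists) for downstream MIS/analytics consumption.
curated.write.mode("overwrite").saveAsTable("sales_mart.daily_sales_by_region")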
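Similarly, the stream-processing skills listed above (Kafka, Spark Streaming) are the kind exercised in a minimal Spark Structured Streaming sketch like the one below. The broker address, topic, and output paths are assumed for illustration, and the spark-sql-kafka connector package would need to be available to the Spark session.

# Minimal sketch: read IoT/device events from Kafka and land them as Parquet.
# Broker, topic, and paths are assumed placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("device-events-stream").getOrCreate()

# Ingest device events from a Kafka topic.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "device-events")
         .load()
)

# Kafka delivers key/value as binary; decode the payload for downstream use.
decoded = events.select(
    F.col("key").cast("string").alias("device_id"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

# Land the decoded stream as Parquet, with checkpointing for recovery.
query = (
    decoded.writeStream.format("parquet")
           .option("path", "/mnt/curated/device_events")
           .option("checkpointLocation", "/mnt/checkpoints/device_events")
           .start()
)
query.awaitTermination()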