Lekshmi Nair

Brussels

Summary

A Certified Scrum Master with 9+ years of experience in data engineering, committed to facilitating agile practices and fostering high-performing teams. Expert in data architecture, ETL processes, and data analytics, and skilled in bridging the gap between technical teams and business stakeholders to deliver data-driven solutions that drive organizational success.

Overview

9 years of professional experience
1 Certification

Work History

Scrum Master

TCS - Proximus
Brussels
08.2023 - Current
  • Facilitated sprint planning, daily stand-up meetings, retrospectives, and reviews for EDH Squad.
  • Worked with product owners and end users to prioritize feature development and enhancements based on critical paths and business needs.
  • Encouraged collaboration among team members through pair programming activities and knowledge-sharing sessions.
  • Supported the development team in understanding user stories, acceptance criteria, and definition of done.
  • Championed the Scrum values of commitment, focus, openness, respect, and courage, along with the pillars of transparency and inspection.

Data Engineer

TCS - Proximus
Brussels
12.2021 - 09.2023
  • Worked as a Data Engineer at Proximus, the leading supplier of global telecommunications solutions on the Belgian market.
  • Built a central data hub for batch-oriented, mini-batch, near-real-time, and streaming data feeds using Proximus Framework services.
  • Migrated multiple Proximus legacy source systems to CDP.
  • Developed and managed detailed deployment plans, ensuring alignment with project timelines and business objectives.
  • Set up and maintained CI/CD pipelines using GitLab to streamline the deployment process.

Data Engineer

TCS - EasyJet
Kochi
01.2020 - 10.2021
  • Worked as a Data Engineer for EasyJet Airlines, where a data hub was created to ingest data from different legacy sources, load it into Hadoop Hive tables, and transform it into user-defined formats that reporting tools such as Tableau consume for the user interface.
  • Leveraged PySpark, Hive, Workflows, and shell scripting to develop the framework.
  • Designed and developed the data ingestion framework and onboarded new data sources into the data hub, as sketched below.
  • Proactively resolved production L3 issues.
  • Coordinated and centralized the production release processes.
  • Collaborated with the data science team to understand and implement requirements for curation and transformation.
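
The following is a minimal, hypothetical sketch of the kind of ingestion step described above: reading a raw feed and persisting it into a partitioned Hive table. It is written in Scala for consistency with the other sketches in this document (the EasyJet framework itself used PySpark, Hive, and shell scripting), and the paths, database, table, and column names are invented for illustration only.

  // Hypothetical ingestion step: load a raw CSV feed into a partitioned Hive table.
  // All names (paths, database, table, partition column) are illustrative assumptions.
  import org.apache.spark.sql.{SaveMode, SparkSession}

  object IngestBookings {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("bookings-ingestion")
        .enableHiveSupport()              // enables writing to Hive-managed tables
        .getOrCreate()

      // Read a raw landing-zone feed (assumed to contain a flight_date column)
      val raw = spark.read
        .option("header", "true")
        .csv("/landing/bookings/2021-10-01/")

      // Persist into a partitioned Hive table that reporting tools can query
      raw.write
        .mode(SaveMode.Append)
        .partitionBy("flight_date")
        .saveAsTable("datahub.bookings_raw")

      spark.stop()
    }
  }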

Scala Developer

TCS - Belgium Rail/NMBS
Brussels
12.2019 - 12.2020
  • Worked as a developer on the WIDO project, a migration effort at Belgian Rail to move the sales application for regular and season tickets from C and COBOL to Scala on the Play framework.
  • Converted the NMBS pricing engine from legacy technologies (C and COBOL) into a modern, scalable, concurrent, and highly available high-performance microservice solution.
  • The migration from the legacy applications to modern technologies delivered a significant improvement in response times.
  • Beyond the migration itself, exposed a number of smaller underlying services that other NMBS applications, such as mobile apps, can reuse (see the sketch after this list).
  • Created an interface layer (Tcpx) that lets all existing applications switch to WIDO without any code changes, which supports business continuity.
  • Thoroughly compared the legacy application and WIDO, field by field in the results and on response time.
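
To illustrate the target architecture, here is a minimal, hypothetical Play framework endpoint in Scala that exposes a pricing lookup as a small service. The controller, service trait, route, and station codes are assumptions for illustration and are not the actual WIDO code.

  // Hypothetical Play controller exposing a pricing lookup as a JSON endpoint.
  import javax.inject.Inject
  import play.api.libs.json.Json
  import play.api.mvc.{AbstractController, ControllerComponents}

  // Stand-in for the migrated pricing engine (an assumption, not the real interface)
  trait PricingEngine {
    def priceFor(origin: String, destination: String): BigDecimal
  }

  class PriceController @Inject()(engine: PricingEngine, cc: ControllerComponents)
      extends AbstractController(cc) {

    // Assumed route:  GET /price?from=BRU&to=GNT  ->  {"price": 7.40}
    def price(from: String, to: String) = Action {
      Ok(Json.obj("price" -> engine.priceFor(from, to)))
    }
  }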

Spark Developer

TCS - Nielsen
Kochi
09.2017 - 11.2019
  • Worked as a developer on the Connect application, where the framework converts and moves source data (finished goods) by transforming and merging customer databases into a target data model, bridging the gap between the data lake and the reporting platforms.
  • Leveraged Spark and Scala to develop the framework and perform transformations and aggregations.
  • Created detailed workflows and prototypes and effectively communicated the interactive design to the customer.
  • Migrated Hive queries to Spark using Scala.
  • Transformed the customer database into a presentation-ready database for the reporting layer, so that Impala and Snowflake can use it for visualization.
  • Transformed and merged customer-provided data into a common data model consisting of dimension-specific lookup tables and fact tables using Spark SQL and Scala, as sketched below.
  • Experienced in converting business processes into RDD transformations using Apache Spark and Scala.
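
The following is a minimal, hypothetical Spark SQL and Scala sketch of the kind of dimension/fact modelling described above; the database, table, and column names are invented and do not reflect the actual Nielsen data model.

  // Hypothetical transformation: split a customer sales feed into a product
  // dimension and a sales fact table. All names are illustrative assumptions.
  import org.apache.spark.sql.{SaveMode, SparkSession}
  import org.apache.spark.sql.functions._

  object BuildCommonModel {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("common-data-model")
        .enableHiveSupport()
        .getOrCreate()

      val sales = spark.table("staging.customer_sales")

      // Dimension: one row per distinct product, with a simple surrogate key
      val dimProduct = sales
        .select("product_code", "product_desc", "category")
        .distinct()
        .withColumn("product_key", monotonically_increasing_id())
      dimProduct.write.mode(SaveMode.Overwrite).saveAsTable("model.dim_product")

      // Fact: measures joined back to the dimension's surrogate key
      val factSales = sales
        .join(dimProduct, Seq("product_code"))
        .select(col("product_key"), col("store_id"), col("period"),
                col("units"), col("value"))
      factSales.write.mode(SaveMode.Overwrite).saveAsTable("model.fact_sales")

      spark.stop()
    }
  }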

Big Data Developer

TCS - Nielsen
Kochi
06.2015 - 08.2017
  • Worked as a developer in the ETL team, where an ETL framework and data lake were designed and developed on a big data system, integrating and transforming huge volumes of data from several sources into one common platform to serve the needs of Nielsen's clients.
  • Leveraged Java, MapReduce, Spark, Scala, shell scripting, and HQL to develop the framework.
  • Designed and developed the ETL model and loaded data into the data lake.
  • Onboarded data from 100 countries onto a single platform under a common data model, as sketched below.
  • Contributed to designing and developing the common-format tables.
  • Extensively wrote and fine-tuned Hive jobs for optimized performance.
  • Resolved challenges in handling the diverse structures of data received from different retailers.
  • Deployed multi-module applications using Maven and Jenkins.
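
Below is a minimal, hypothetical Scala sketch of loading a partitioned common-format Hive table through Spark SQL, illustrating the kind of ETL load and partition-aware layout mentioned above; the database, table, and column names are invented.

  // Hypothetical load into a partitioned "common format" table via Spark SQL.
  import org.apache.spark.sql.SparkSession

  object LoadCommonFormat {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("common-format-load")
        .enableHiveSupport()
        .getOrCreate()

      // Allow dynamic partitioning so one job can load many country/period partitions
      spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

      spark.sql(
        """
          |INSERT OVERWRITE TABLE datalake.retail_common PARTITION (country, period)
          |SELECT item_code, store_id, units, value, country, period
          |FROM   staging.retailer_feed
          |WHERE  period >= '2017-01'   -- restrict the load window
        """.stripMargin)

      spark.stop()
    }
  }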

Education

Bachelor of Technology - Electronics and Communications Engineering

Kerala University
India
06.2014

Skills

Agile Tools:

-Target Process

-Jira

-Confluence

-Miro

-FigJam

Programming Languages:

-Scala

-SQL

-Python

-Java

Big Data Technologies:

-Hadoop

-Spark

-Hive

-Impala

-Oozie

-Kafka

Tools:

-WinSCP

-PuTTY

-Cmder

-IntelliJ

-Eclipse

-VS Code

CI/CD:

-Git, GitHub, GitLab

-SVN

-Bitbucket

-Jenkins

Certification

  • Professional Scrum Master I (PSM I)
  • AZ-900: Microsoft Azure Fundamentals
  • DP-900: Microsoft Azure Data Fundamentals
  • HDE 100 - Hadoop Essentials
  • DEV 360 - Apache Spark Essentials

Timeline

Scrum Master

TCS - Proximus
08.2023 - Current

Data Engineer

TCS - Proximus
12.2021 - 09.2023

Data Engineer

TCS - EasyJet
01.2020 - 10.2021

Scala Developer

TCS - Belgium Rail/NMBS
12.2019 - 12.2020

Spark Developer

TCS - Nielsen
09.2017 - 11.2019

Big Data Developer

TCS - Nielsen
06.2015 - 08.2017

Bachelor of Technology - Electronics and Communications Engineering

Kerala University