Balasubramanian Swaminathan

Actuarial Platform & Data Engineering Leader

Education

Master of Engineering - Instrumentation, Madras Institute of Technology, Anna University, India

Bachelor of Engineering - Electrical and Electronics, Adhiparasakthi Engineering College, University of Madras, India

College Project

Fuzzy PID Controller with Genetic Algorithms

• Created a fuzzy-PID control system by blending fuzzy logic with classic PID control to handle nonlinear instrument dynamics, and verified it extensively in MATLAB Simulink.

• Converted and optimized MATLAB simulations into clean, efficient C++ code for deployment in real-time embedded systems.

• Encoded fuzzy membership functions and PID gains as chromosomes in a genetic algorithm, enabling automated tuning across the full parameter space.

• Designed a multi-criteria fitness function that balances energy efficiency with performance metrics like squared error, overshoot, and settling time.

• Ran the genetic algorithm over many generations in MATLAB to tune the controller parameters under different disturbance scenarios (a simplified sketch of the encoding and fitness evaluation follows).
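
A minimal Python sketch of the chromosome encoding and multi-criteria fitness evaluation described above. The gene layout, the weights, and the `simulate_step_response` helper are illustrative placeholders; the original work evaluated candidates against MATLAB/Simulink plant models rather than this stand-in.

```python
import numpy as np

# Hypothetical gene layout: 9 fuzzy membership-function centers + 3 PID gains.
GENE_COUNT = 12

def decode(chromosome):
    mf_centers = chromosome[:9]      # fuzzy membership-function centers
    kp, ki, kd = chromosome[9:]      # classic PID gains
    return mf_centers, (kp, ki, kd)

def fitness(chromosome, simulate_step_response):
    """Multi-criteria cost: lower is better.

    `simulate_step_response` stands in for the Simulink model; it is assumed
    to return the error signal, control signal, overshoot, and settling time
    for one step-response run with the given parameters.
    """
    mf_centers, gains = decode(chromosome)
    err, u, overshoot, settling_time = simulate_step_response(mf_centers, gains)
    ise = float(np.sum(err ** 2))    # integral of squared error
    energy = float(np.sum(u ** 2))   # control effort (energy use)
    # Illustrative weights balancing performance against energy efficiency.
    return 1.0 * ise + 0.5 * overshoot + 0.2 * settling_time + 0.1 * energy

def evolve(pop, simulate, generations=50, mutation_scale=0.05, seed=0):
    """Tiny GA loop: tournament selection, blend crossover, Gaussian mutation."""
    rng = np.random.default_rng(seed)
    for _ in range(generations):
        scores = np.array([fitness(ind, simulate) for ind in pop])
        new_pop = []
        for _ in range(len(pop)):
            a, b = rng.choice(len(pop), size=2, replace=False)
            p1 = pop[a] if scores[a] < scores[b] else pop[b]
            c, d = rng.choice(len(pop), size=2, replace=False)
            p2 = pop[c] if scores[c] < scores[d] else pop[d]
            alpha = rng.random()
            child = alpha * p1 + (1 - alpha) * p2 + rng.normal(0, mutation_scale, GENE_COUNT)
            new_pop.append(child)
        pop = np.array(new_pop)
    return pop
```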

About Me

IT leader with 20 years of experience modernizing actuarial platforms
for life and health insurers. I define multi-year platform roadmaps,
drive high-availability solutions, and deliver cost-efficient
cloud-native data architectures. Focused on automation,
observability, and team leadership.

Work Experience

Ramco Systems (2001 – 2008)

Senior Developer & Team Leader

Apr 2002 – Jan 2003: Ramco Optima

Migrated a generic process-industry expert system to VC++/MFC 6 with MS SQL and multithreading, enhancing UI functionality and memory robustness; validated real-time control loops on-site at Madras Cement Ltd.

Feb 2003 – Sep 2004: Data Modeling & Analysis Tool

Built a real-time soft-sensor in VC++/MFC using neural networks, PLS regression, clustering and PCA for data cleaning and dimensionality reduction; integrated OPC/SCADA data feeds and deployed for fineness prediction at Ambuja and MCL cement plants.

Feb 2005 – Aug 2005: Optima Enhancement

Led a 4-developer team to extend the Optima expert system with string/date-time data types, port it to 64-bit Itanium, and implement multithreading optimizations for sub-second cement and power-plant loop control.

Apr 2006 – Sep 2006: Genetic Programming Module

Architected a symbolic-regression soft-sensor tool in VC++/C#/VB .NET, leveraging genetic programming to auto-derive mathematical models (e.g., CO-emission prediction) for thermal power-plant analytics.
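
For a flavor of the approach, a comparable symbolic-regression model can be sketched in a few lines of Python with the open-source gplearn library; the data, column meanings, and "true" relationship below are hypothetical, and the original module was implemented in VC++/C#/VB .NET rather than Python.

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor

# Hypothetical plant data: three operating parameters vs. a measured CO emission.
rng = np.random.default_rng(42)
X = rng.uniform(size=(500, 3))                        # e.g. load, O2 %, flue-gas temperature
y = 2.1 * X[:, 0] * X[:, 1] + 0.4 * np.sqrt(X[:, 2])  # stand-in "true" relationship

# Genetic programming evolves expression trees toward the best-fitting formula,
# with a parsimony penalty to keep the derived model readable.
est = SymbolicRegressor(
    population_size=1000,
    generations=20,
    function_set=("add", "sub", "mul", "div", "sqrt"),
    parsimony_coefficient=0.001,
    random_state=0,
)
est.fit(X, y)
print(est._program)   # the evolved symbolic model for CO emission
```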

Jan 2007 – Apr 2007: Rule Engine Evaluation

Evaluated the ILOG and JBoss Rules engines for enterprise lifecycle support, rule syntax, and performance; wrapped rule services in Java/C# web APIs and integrated them into the Ramco Enterprise Suite.

Jun 2007 – Sep 2007: Business Process Optimization Editor & Solver

Directed the creation of a generic LP solver with an ASCII-to-LP parser and an embedded math editor, exposing solver functionality as RESTful web services within Ramco VirtualWorks for dynamic process automation.

Jan 2008 – Jun 2008: Route Optimization for Logistics

Led development of a C++/COM/SQL-based LP solver for the Vehicle Routing Problem with Time Windows, integrating it into Ramco VirtualWorks to deliver enterprise-grade logistics optimization.

Cognizant Technology Solutions, Chennai, India (Jun 2008 – Dec 2012)

Project Manager / Project Lead – Healthcare Domain

Jun 2008 – Mar 2009: RIOS Re-Platform (UHG International)

Led phase-2 re-platforming of RIOS from SQR/UNIX to a .NET/Crystal Reports solution on MS SQL 2005 and DB2; architected the data-access layer in C#, enforced coding standards, and delivered scalable, maintainable reporting services.

Apr 2009 – Dec 2011: MMS Data Management (Molina Medicaid – Maine & Idaho)

Directed a 16-member team to develop and maintain SSIS-based ETL workflows and VB .NET interfaces for Medicaid eligibility, enrollment, and claims data integration into QNXT. Key contributions:

  • Designed and automated daily ETL pipelines, eliminating manual processing and enhancing data reliability.
  • Developed and optimized SSRS reports by migrating legacy TR/CR reports and enabling real-time operational dashboards.
  • Established unit-testing, code-review, and mentoring processes to ensure high code quality and on-time delivery.
  • Managed full SDLC—requirements, design, implementation, testing, deployment, and 24×7 production support—tracking change requests and communicating risks and milestones to stakeholders.
  • Enforced HIPAA-compliant data governance through rigorous validation, audit trails, and performance tuning.

Jan 2012 – Jun 2012: Facets MSM ICD-10 Remediation (WellPoint)

Led a 10-member team through the ICD-10 upgrade of Cognizant MSM on UNIX; coordinated capacity planning, performance tuning of Sybase/Facets interfaces, and stakeholder communication to meet regulatory deadlines.

Jun 2012 – Dec 2012: CSP F2F Migration (UHG Facets Solution Group)

Managed end-to-end migration of customer-service modules into Facets using Sybase 15, IBM DataStage ETL, and MS Project; implemented risk-mitigation strategies, tracked milestones, and orchestrated cross-functional teams to achieve on-schedule go-live.

MetLife (via Cognizant), Onsite Technical Lead (2013 – 2020)

Jan 2013 – Dec 2015: Asset Projection & ETL Lead (Kamakura Risk Manager)

  • Managed BAU support for Kamakura Risk Manager COTS—ingesting investment, market and interest-rate scenario feeds into SQL Server for CFT, ALM and RBC projections.
  • Designed and implemented SSIS-based ETL pipelines sourcing from Markit EDM and legacy systems, improving data quality and reducing manual reconciliation by 40%.
  • Validated, transformed and loaded security-master, portfolio, loan and derivatives data into the KRM modeling schema, ensuring end-to-end data integrity.

Jan 2015 – Dec 2017: Data Warehouse Performance & Migration Lead

  • Migrated KRM's SQL Server asset-projection database to Microsoft Parallel Data Warehouse (PDW), achieving 5× query performance gains for large-scale HPC-grid runs.
  • Continued BAU support for Cash Flow Testing and GAAP Loss Recognition—tuning PDW distribution and indexing strategies for sub-minute report generation.
  • Led the transition of asset-allocation data feeds from mainframe to PDW, standardizing metadata and implementing automated load processes.

Jan 2017 – Dec 2019: Strategic Migration Lead (FIS Prophet ALS)

  • Spearheaded the replacement of KRM with FIS Prophet ALS for asset projections—oversaw Data Conversion System (DCS) pipelines to generate Model Point Files from existing investment feeds.
  • Integrated Prophet's encrypted output via the R2DB tool into SQL Server, preserving existing asset-projection formats and enabling seamless business continuity.
  • Provisioned and configured the HPC-grid run environment for Prophet ALS, coordinating with infrastructure teams to validate end-to-end workflow.

Jan 2018 – Dec 2020: Cloud Platform Modernization – Azure Asset Data Repository

  • Defined functional requirements and oversaw migration of on-prem SQL repositories to an Azure-based Asset Data Repository, leveraging Azure Data Lake and Databricks/Spark for scalable analytics.
  • Collaborated with cloud architects to design data ingestion patterns, security policies and CI/CD pipelines—accelerating data availability for downstream risk and finance teams.

Key Technologies: COTS (KRM, Prophet ALS), SQL Server, SSIS, PDW, HPC Grid, Azure Data Lake, Databricks/Spark, ETL/ELT, ALM/CFT/RBC projections, data governance.

MetLife – Group Benefits Valuation LDTI Compliance (Apr 2020 – Jan 2023)

SAFe 5.0 Scrum Master & Actuarial Data Engineer, Cary, NC

MetLife – Sr. Scrum Leader I / Application Lead (Oct 2022 – Dec 2024)

Unified Actuarial Platform (UAP) — TOSCA Automation Testing Lead (Jan 2025 – Present)

MetLife

Project mission: merge a half‑dozen legacy actuarial valuation tools into a single cloud‑native platform, slash infrastructure & licensing spend, and consolidate disparate EAI codebases into a maintainable, DevOps‑friendly stack.

Highlights & Achievements

1. End‑to‑End Test Strategy & Roadmap
  • Mapped > 90% of actuarial "money flows" across Cash‑Flow Testing, LDTI, IFRS & GAAP modules.
  • Introduced risk‑based test design that focuses automation where financial exposure is highest.
2. Scalable, Reusable TOSCA Framework
  • Built a plug‑in architecture (API, UI, Databricks, file‑transfer) with 80% component reuse across product lines.
  • Added data‑driven libraries and self‑healing locators, driving maintenance effort down 45% YoY.
3. CI/CD & DevOps Integration
  • Wired TOSCA into Azure DevOps pipelines: every PR spins up a disposable test environment, executes smoke + impact‑based regression, and posts results back to Git as status checks.
  • Containerized TOSCA Distributed Execution Agents (DEX) for frictionless scale‑out on Kubernetes/HPC nodes.
4. Databricks Data‑Integrity Automation
  • Created PySpark‑backed TOSCA modules that validate Delta‑Lake transformations against actuarial business rules, catching silent data drifts 48 hrs earlier than prior manual spot checks (a simplified PySpark sketch follows this list).
5. Actionable Quality Analytics
  • Defined a "single pane of glass" Power BI dashboard: coverage, pass‑rate trends, flakiness index, MTTR, and escaped-defect density.
  • Metrics now feed quarterly OKRs and have already supported a 30% drop in escaped production defects.
6. Regression Suite at Scale
  • Automated 600+ high‑risk scenarios spanning UI, REST, file‑transfer and grid‑compute workflows; nightly run completes in ≈ 3 hrs (vs 16 hrs manual) with fully automated triage hooks.
  • Demonstrated ROI within two sprints: first full suite uncovered a $5M reserve‑calculation defect before UAT.
7. Leadership & Collaboration
  • Coach scrum teams on BDD, API virtualization, and shift‑left peer testing; onboarded new hires with a two‑week "TOSCA boot camp."
  • Work daily with solution architects, actuaries, and DevOps to align pipelines, environments, and release cadences.
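
For item 4 above, a stripped-down PySpark sketch of the kind of Delta Lake integrity checks involved is shown below; the table names, columns, tolerance, and rules are hypothetical, and in practice these checks run inside TOSCA modules rather than as a standalone script.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-integrity-check").getOrCreate()

# Hypothetical Delta tables: raw policy cash flows and the transformed projection output.
source = spark.read.table("raw.policy_cashflows")
target = spark.read.table("curated.reserve_projection")

checks = []

# Rule 1: no policy may be dropped by the transformation.
missing = (source.select("policy_id").distinct()
                 .join(target.select("policy_id").distinct(), "policy_id", "left_anti"))
checks.append(("no_dropped_policies", missing.count() == 0))

# Rule 2: total projected cash flow must reconcile with the source within 1%.
src_total = source.agg(F.sum("cashflow_amount").alias("t")).first()["t"]
tgt_total = target.agg(F.sum("projected_cashflow").alias("t")).first()["t"]
checks.append(("cashflow_reconciles", abs(src_total - tgt_total) <= 0.01 * abs(src_total)))

# Rule 3: no negative reserves (an illustrative actuarial sanity rule).
negatives = target.filter(F.col("reserve_amount") < 0).count()
checks.append(("no_negative_reserves", negatives == 0))

failed = [name for name, ok in checks if not ok]
if failed:
    # Surfacing failures here lets a TOSCA module (or any CI step) fail the run early.
    raise AssertionError(f"Data-integrity checks failed: {failed}")
```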

Key Technologies: TOSCA (v16+) • Azure DevOps Pipelines • Kubernetes & DEX • PySpark / Databricks • REST & gRPC • Power BI • Git • BDD (SpecFlow) • Kubernetes Secrets & Service Connections • SQL/Delta Lake validation

Bottom line: by marrying deep actuarial domain knowledge with modern test‑automation engineering, I'm helping MetLife deliver a unified platform that is faster to release, cheaper to run, and trustworthy at every step of the valuation lifecycle.

Databricks Upskilling Program for Actuarial Analysts (Oct 2024 – Present)

MetLife

Core Skills & Competencies

Contact

Email: sbala2me@gmail.com

Phone: +1 (848) 702-4902

Location: Cary, NC

GitHub: sbala2me

LinkedIn: sbala2me
