Fuzzy PID Controller with Genetic Algorithms
• Created a fuzzy-PID control system that blends fuzzy logic with classical PID control, validated extensively in MATLAB Simulink against nonlinear instrument dynamics.
• Converted and optimized MATLAB simulations into clean, efficient C++ code for deployment in real-time embedded systems.
• Encoded fuzzy membership-function parameters and PID gains as chromosomes in a genetic algorithm, enabling automated, end-to-end controller tuning.
• Designed a multi-criteria fitness function that balances energy efficiency against performance metrics such as integral of squared error, overshoot, and settling time (a minimal sketch follows this list).
• Ran repeated generations of the genetic algorithm in MATLAB to tune controller parameters under a range of disturbance scenarios.
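The chromosome layout and fitness scoring are easiest to see in code. The sketch below is a minimal Python illustration rather than the project code (which was MATLAB and C++): it assumes a simple first-order plant, encodes only the three PID gains as the chromosome, omits the fuzzy inference layer, and uses illustrative weights for the fitness terms.

```python
# Minimal Python sketch of the GA encoding and multi-criteria fitness described
# above. Assumptions: a first-order plant, a chromosome of just [Kp, Ki, Kd]
# (fuzzy membership parameters omitted), and illustrative fitness weights.
import numpy as np

def simulate_step(Kp, Ki, Kd, t_end=10.0, dt=0.01, tau=1.0):
    """Unit-step response of a first-order plant G(s) = 1/(tau*s + 1) under PID control."""
    n = int(t_end / dt)
    y, integral, prev_err = 0.0, 0.0, 1.0
    ys, us = np.zeros(n), np.zeros(n)
    for k in range(n):
        err = 1.0 - y                        # unit-step setpoint
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = Kp * err + Ki * integral + Kd * deriv
        y += dt * (-y + u) / tau             # explicit Euler step of the plant
        prev_err = err
        ys[k], us[k] = y, u
    return ys, us, dt

def fitness(chromosome, w=(1.0, 0.5, 0.2, 0.01)):
    """Chromosome = [Kp, Ki, Kd]; lower is better.
    Terms: integral of squared error, overshoot, settling time, control energy."""
    Kp, Ki, Kd = chromosome
    ys, us, dt = simulate_step(Kp, Ki, Kd)
    err = 1.0 - ys
    ise = np.sum(err ** 2) * dt
    overshoot = max(0.0, float(ys.max()) - 1.0)
    outside = np.where(np.abs(err) > 0.02)[0]            # 2% settling band
    t_settle = (outside[-1] + 1) * dt if outside.size else 0.0
    energy = np.sum(us ** 2) * dt
    return w[0] * ise + w[1] * overshoot + w[2] * t_settle + w[3] * energy

# Example: score two candidate gain sets (lower cost is better)
print(fitness(np.array([2.0, 1.0, 0.1])))
print(fitness(np.array([0.5, 0.2, 0.0])))
```

In the full workflow, the genetic algorithm evolves a population of such chromosomes (extended with the membership-function parameters) through selection, crossover, and mutation, retaining the candidates with the lowest composite cost.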
About Me
IT leader with 20 years of experience modernizing actuarial platforms
for life and health insurers. I define multi-year platform roadmaps,
drive high-availability solutions, and deliver cost-efficient
cloud-native data architectures. Focused on automation,
observability, and team leadership.
Work Experience
Ramco Systems (2001 – 2008)
Senior Developer & Team Leader
Apr 2002 – Jan 2003: Ramco Optima
Migrated a generic process-industry expert system to VC++/MFC 6 with MS SQL and multithreading, enhancing UI functionality and memory robustness; validated real-time control loops on-site at Madras Cement Ltd.
Feb 2003 – Sep 2004: Data Modeling & Analysis Tool
Built a real-time soft-sensor in VC++/MFC using neural networks, PLS regression, clustering and PCA for data cleaning and dimensionality reduction; integrated OPC/SCADA data feeds and deployed for fineness prediction at Ambuja and MCL cement plants.
Feb 2005 – Aug 2005: Optima Enhancement
Led a 4-developer team to extend the Optima expert system with string/date-time data types, port it to 64-bit Itanium, and implement multithreading optimizations for sub-second cement and power-plant loop control.
Apr 2006 – Sep 2006: Genetic Programming Module
Architected a symbolic-regression soft-sensor tool in VC++/C#/VB .NET, leveraging genetic programming to auto-derive mathematical models (e.g., CO-emission prediction) for thermal power-plant analytics.
Jan 2007 – Apr 2007: Rule Engine Evaluation
Evaluated the ILOG and JBoss Rules engines for enterprise lifecycle support, syntax, and performance; wrapped rule services in Java/C# web APIs and integrated them into the Ramco Enterprise Suite.
Jun 2007 – Sep 2007: Business Process Optimization Editor & Solver
Directed the creation of a generic LP-solver with an ASCII-to-LP parser and embedded math editor, exposing solver functionality as RESTful web services within Ramco VirtualWorks for dynamic process automation.
Jan 2008 – Jun 2008: Route Optimization for Logistics
Led development of a C++/COM/SQL-based LP solver for the Vehicle Routing Problem with Time Windows, integrating it into Ramco VirtualWorks to deliver enterprise-grade logistics optimization.
Cognizant Technology Solutions, Chennai, India (Jun 2008 – Dec 2012)
Project Manager / Project Lead – Healthcare Domain
Jun 2008 – Mar 2009: RIOS Re-Platform (UHG International)
Led phase-2 re-platforming of RIOS from SQR/UNIX to a .NET/Crystal Reports solution on MS SQL 2005 and DB2; architected the data-access layer in C#, enforced coding standards, and delivered scalable, maintainable reporting services.
Apr 2009 – Dec 2011: MMS Data Management (Molina Medicaid – Maine & Idaho)
Directed a 16-member team to develop and maintain SSIS-based ETL workflows and VB .NET interfaces for Medicaid eligibility, enrollment, and claims data integration into QNXT. Key contributions:
- Designed and automated daily ETL pipelines, eliminating manual processing and enhancing data reliability.
- Developed and optimized SSRS reports by migrating legacy TR/CR reports and enabling real-time operational dashboards.
- Established unit-testing, code-review, and mentoring processes to ensure high code quality and on-time delivery.
- Managed full SDLC—requirements, design, implementation, testing, deployment, and 24×7 production support—tracking change requests and communicating risks and milestones to stakeholders.
- Enforced HIPAA-compliant data governance through rigorous validation, audit trails, and performance tuning.
Jan 2012 – Jun 2012: Facets MSM ICD-10 Remediation (WellPoint)
Led a 10-member team through the ICD-10 upgrade of Cognizant MSM on UNIX; coordinated capacity planning, performance tuning of Sybase/Facets interfaces, and stakeholder communication to meet regulatory deadlines.
Jun 2012 – Dec 2012: CSP F2F Migration (UHG Facets Solution Group)
Managed end-to-end migration of customer-service modules into Facets using Sybase 15, IBM DataStage ETL, and MS Project; implemented risk-mitigation strategies, tracked milestones, and orchestrated cross-functional teams to achieve on-schedule go-live.
MetLife (via Cognizant), Onsite Technical Lead (2013 – 2020)
Jan 2013 – Dec 2015: Asset Projection & ETL Lead (Kamakura Risk Manager)
- Managed BAU support for Kamakura Risk Manager COTS—ingesting investment, market and interest-rate scenario feeds into SQL Server for CFT, ALM and RBC projections.
- Designed and implemented SSIS-based ETL pipelines sourcing from MarkIT EDM and legacy systems, improving data quality and reducing manual reconciliation by 40%.
- Validated, transformed and loaded security-master, portfolio, loan and derivatives data into the KRM modeling schema, ensuring end-to-end data integrity.
Jan 2015 – Dec 2017: Data Warehouse Performance & Migration Lead
- Migrated KRM's SQL Server asset-projection database to Microsoft Parallel Data Warehouse (PDW), achieving 5× query performance gains for large-scale HPC-grid runs.
- Continued BAU support for Cash Flow Testing and GAAP Loss Recognition—tuning PDW distribution and indexing strategies for sub-minute report generation.
- Led the transition of asset-allocation data feeds from mainframe to PDW, standardizing metadata and implementing automated load processes.
Jan 2017 – Dec 2019: Strategic Migration Lead (FIS Prophet ALS)
- Spearheaded the replacement of KRM with FIS Prophet ALS for asset projections—oversaw Data Conversion System (DCS) pipelines to generate Model Point Files from existing investment feeds.
- Integrated Prophet's encrypted output via the R2DB tool into SQL Server, preserving existing asset-projection formats and enabling seamless business continuity.
- Provisioned and configured the HPC-grid run environment for Prophet ALS, coordinating with infrastructure teams to validate end-to-end workflow.
Jan 2018 – Dec 2020: Cloud Platform Modernization – Azure Asset Data Repository
- Defined functional requirements and oversaw migration of on-prem SQL repositories to an Azure-based Asset Data Repository, leveraging Azure Data Lake and Databricks/Spark for scalable analytics.
- Collaborated with cloud architects to design data ingestion patterns, security policies and CI/CD pipelines—accelerating data availability for downstream risk and finance teams.
Key Technologies: COTS (KRM, Prophet ALS), SQL Server, SSIS, PDW, HPC Grid, Azure Data Lake, Databricks/Spark, ETL/ELT, ALM/CFT/RBC projections, data governance.
MetLife – Group Benefits Valuation LDTI Compliance (Apr 2020 – Jan 2023)
SAFe 5.0 Scrum Master & Actuarial Data Engineer, Cary, NC
- Agile Delivery & Team Leadership: Served as Scrum Master under SAFe 5.0 for IT enabler and BAU teams in the Group Benefits reserve domain, facilitating PI planning, iteration planning, reviews and retrospectives to drive on-time delivery.
- End-to-End ETL Pipeline Development: Architected and implemented data pipelines from diverse administrative feeds into the PolySystem Health Master and Life Master environments, automating reserve-calculation inputs for the Group Reserve database to support Long Duration Targeted Improvement (LDTI) reporting.
- Process Automation & Data Governance: Designed scalable workflows with embedded key controls, error-handling routines, and logical transformations—reducing manual intervention by over 50% and ensuring HIPAA-grade data integrity for disability income valuations.
- Security Compliance Coordination: Led FTP credentials remediation across multiple source systems, partnering with security and operations teams to meet audit requirements and strengthen data-access controls.
- Stakeholder Collaboration & Performance Optimization: Worked closely with actuarial, finance, and IT stakeholders to refine data models, tune pipeline performance, and transition reserve-processing workflows from the Admin System to the PolySystem platform, loading the resulting reserves into the Global Reserve repository to ensure seamless business continuity.
MetLife – Sr. Scrum Leader I / Application Lead (Oct 2022 – Dec 2024)
- Directed Japan & Korea LDTI valuation runs (production + regression), chairing daily grid‑run stand‑ups and publishing monthly migration dashboards to executives.
- Served as Product Owner for LDTI automation backlog – delivered F4WL coinsurance & YRT true‑up, MRB/DAC model merge, TUTAM & Propensity‑Roll file promotions, and Korea file‑transfer automation, trimming cycle time 40%.
- Closed a high‑severity data‑lineage audit finding by scripting a mainframe‑to‑CSV Perl utility, enabling field‑level reconciliation and securing Deloitte/SOX sign‑off ahead of schedule.
- Led DIL application portfolio (Feb '24‑Dec '24): automated Prophet‑to‑Poly asset flows, tuned EMEA GRR pipelines, built multi‑region AFDF transfers, and onboarded new IFRS & Actuarial regions—supporting UAT & production across EMEA, LATAM, APAC, and U.S. lines.
- Cut infra costs ≈ $10K per environment by migrating AFDF/DIL/ADAM codebase from Anaconda to open‑source Python—identifying compatible packages and updating scripts with zero downtime.
- Championed modernization – scoped SSIS‑to‑ADF migration, completed SAFe RTE/Scrum‑Master and Azure + Databricks DE certifications, and mentored teams on cloud & DevSecOps best practices.
- Trusted business liaison: supported actuaries in Japan, Korea & Malaysia, drove risk logs, and coordinated 12+ audits with zero follow‑up findings.
Unified Actuarial Platform (UAP) — TOSCA Automation Testing Lead (Jan 2025 – Present)
MetLife
Project mission: merge a half‑dozen legacy actuarial valuation tools into a single cloud‑native platform, slash infrastructure & licensing spend, and consolidate disparate EAI codebases into a maintainable, DevOps‑friendly stack.
My Impact in a Nutshell
- Architect & owner of UAP's enterprise‑grade TOSCA practice — from vision and tooling to day‑to‑day execution.
- People leader for a 12‑member, cross‑skill QA guild (API, UI/UX, Databricks data‑integrity, DevOps integration).
- Automation evangelist: shifted teams left, embedded quality gates in CI/CD, and cut regression cycle time from 2 days to ≈ 3 hours.
Highlights & Achievements
1. End‑to‑End Test Strategy & Roadmap
- Mapped > 90% of actuarial "money flows" across Cash‑Flow Testing, LDTI, IFRS & GAAP modules.
- Introduced risk‑based test design that focuses automation where financial exposure is highest.
2. Scalable, Reusable TOSCA Framework
- Built a plug‑in architecture (API, UI, Databricks, file‑transfer) with 80% component reuse across product lines.
- Added data‑driven libraries and self‑healing locators, driving maintenance effort down 45% YoY.
3. CI/CD & DevOps Integration
- Wired TOSCA into Azure DevOps pipelines: every PR spins up a disposable test environment, executes smoke + impact‑based regression, and posts results back to Git as status checks.
- Containerized TOSCA Distributed Execution Agents (DEX) for frictionless scale‑out on Kubernetes/HPC nodes.
4. Databricks Data‑Integrity Automation
- Created PySpark‑backed TOSCA modules that validate Delta‑Lake transformations against actuarial business rules, catching silent data drifts 48 hrs earlier than prior manual spot checks (a minimal validation sketch follows this list).
5. Actionable Quality Analytics
- Defined a "single pane of glass" Power BI dashboard: coverage, pass‑rate trends, flakiness index, MTTR, and escape defect density.
- Metrics now feed quarterly OKRs and have already supported a 30% drop in escaped production defects.
6. Regression Suite at Scale
- Automated 600+ high‑risk scenarios spanning UI, REST, file‑transfer and grid‑compute workflows; nightly run completes in ≈ 3 hrs (vs 16 hrs manual) with fully automated triage hooks.
- Demonstrated ROI within two sprints: first full suite uncovered a $5M reserve‑calculation defect before UAT.
7. Leadership & Collaboration
- Coaches scrum teams on BDD, API virtualization and shift‑left peer testing; onboarded new hires with a two‑week "TOSCA boot camp."
- Works daily with solution architects, actuaries and DevOps to align pipelines, environments and release cadences.
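To make the Databricks data-integrity item (point 4 above) concrete, here is a minimal, hypothetical sketch of the kind of reconciliation rule those PySpark modules enforce. The table paths, column names, and $0.01 tolerance are illustrative assumptions, not UAP's actual objects, and the TOSCA wiring that invokes the check and reports results is omitted.

```python
# Hypothetical reconciliation rule: reserve inputs must tie back to source
# cash flows per policy within a small tolerance. Paths and columns are
# illustrative; the real modules run against UAP's Delta Lake tables.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("reserve-rule-check").getOrCreate()

source = spark.read.format("delta").load("/mnt/uap/raw/policy_cashflows")    # assumed path
target = spark.read.format("delta").load("/mnt/uap/curated/reserve_inputs")  # assumed path

src_agg = source.groupBy("policy_id").agg(F.sum("cashflow_amt").alias("src_total"))
tgt_agg = target.groupBy("policy_id").agg(F.sum("cashflow_amt").alias("tgt_total"))

# Full outer join so policies missing on either side surface as breaks too
breaks = (
    src_agg.join(tgt_agg, "policy_id", "full_outer")
    .withColumn(
        "diff",
        F.abs(F.coalesce(F.col("src_total"), F.lit(0.0)) - F.coalesce(F.col("tgt_total"), F.lit(0.0))),
    )
    .filter(F.col("diff") > 0.01)
)

# A non-zero break count fails the calling test step; offending rows are attached as evidence.
print(f"Reconciliation breaks: {breaks.count()}")
breaks.show(20, truncate=False)
```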
Key Technologies: TOSCA (v16+) • Azure DevOps Pipelines • Kubernetes & DEX • PySpark / Databricks • REST & gRPC • Power BI • Git • BDD (SpecFlow) • Kubernetes Secrets & Service Connections • SQL/Delta Lake validation
Bottom line: by marrying deep actuarial domain knowledge with modern test‑automation engineering, I'm helping MetLife deliver a unified platform that is faster to release, cheaper to run, and trustworthy at every step of the valuation lifecycle.
Databricks Upskilling Program for Actuarial Analysts (Oct 2024 – Present)
MetLife
- Designed a bespoke "Actuary‑to‑Databricks" curriculum that blends actuarial use‑cases (reserve roll‑forwards, LDTI analytics, ALM simulations) with hands‑on PySpark labs—turning domain experts into self‑sufficient data engineers in 12 weeks.
- Built a companion microsite featuring bite‑sized videos, code‑along notebooks, cheat sheets, and an interactive Q&A forum; 85% of learners return weekly for refreshers or advanced topics.
- Led live cohort training for 100+ actuaries—delivering four virtual boot‑camps, 20 office‑hour sessions, and personalized "code clinics" that resolved > 250 real‑world notebook blockers.
- Gamified the journey with badges and capstone challenges; 70% of participants earned the "Databricks Bronze" badge and 35% progressed to "Silver," completing a full PySpark model‑point‑file pipeline.
- Integrated learning with day‑to‑day work: shipped starter notebooks and reusable widgets that plug directly into UAP's Delta Lake tables, cutting exploratory data‑prep time from hours to minutes (a representative starter cell is sketched after this list).
- Captured and published success metrics on a Power BI dashboard—showing a 40% drop in analyst‑to‑IT help tickets and a 25% speed‑up in quarterly valuation data wrangling.
- Institutionalized the program by training six internal "champions," creating a self‑sustaining community of practice that now hosts monthly lightning talks and notebook‑swap sessions.
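For flavor, the cell below sketches what such a starter notebook might look like: a widget selects the valuation period and the filtered data lands in a temp view for follow-on SQL. The table and widget names are illustrative assumptions, not UAP's actual objects; `dbutils`, `spark`, and `display` are the globals Databricks notebooks provide.

```python
# Hypothetical starter-notebook cell shipped to analysts (names are illustrative).
from pyspark.sql import functions as F

# Widget lets an analyst pick the valuation period without editing code
dbutils.widgets.text("valuation_period", "2024Q4", "Valuation period (YYYYQn)")
period = dbutils.widgets.get("valuation_period")

mpf = (
    spark.read.table("uap.model_point_files")        # assumed Delta table name
    .filter(F.col("valuation_period") == period)
)

mpf.createOrReplaceTempView("mpf_current")           # now queryable from %sql cells
display(mpf.limit(10))
```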
Core Skills & Competencies
- Actuarial Platforms: FIS Prophet (ALS, IFRS 17, LDTI), PolySystems, Kamakura
- Cloud & Data Engineering: Azure (Synapse, Databricks, Data Factory), AWS (Glue, EMR), Snowflake
- DevOps & Automation: CI/CD, Kubernetes, Terraform, Ansible, RPA
- Observability & SRE: Application Insights, Prometheus, Grafana, Azure Automation
- Leadership & Delivery: Agile/SAFe, PMP, Scrum Master, cross-functional team building