Kurt Fehlhauer


I am a software engineering leader with a track record of delivering systems in complex environments.


2/2022 – Present

Chief Data Architect

Stellantis: Remote from Odessa, FL

As the leader of an international team at Stellantis, I oversee the development and implementation of world-class data products that advance mobility in connected vehicles. This includes managing software engineering, data engineering, and DevOps teams to build the infrastructure that supports data science, engineering, and marketplace activities.

Additional responsibilities and accomplishments include:

  • Successfully forecasting and managing a data platform budget in the tens of millions of dollars.
  • Developing a robust data infrastructure for connected vehicle data.
  • Collaborating with data privacy officers to implement effective anonymization strategies for the protection of personally identifiable information (PII).
  • Facilitating big data design sessions to drive innovation and progress.
  • Advocating for functional design principles and developing numerous systems using Scala and the ZIO framework.
  • Providing mentorship and guidance to managers and leading engineers to foster professional growth and development.

5/2019 – 2/2022

Senior Manager, ETL

Activision Publishing: Remote from Tampa Bay, FL

I served as Senior Manager of ETL at Activision Publishing, leading a diverse international team of data engineers. My team provided data products and support to data science partners and analysts throughout the company. I also collaborated with data scientists to design and implement feature engineering.

Additional responsibilities and accomplishments include:

  • Managing all silver/gold level data for the Call of Duty franchise
  • Leading the successful migration of ETL processes from Amazon Web Services (AWS) to Google Cloud Platform (GCP)
  • Developing a Spark-based ETL framework that enabled self-service ETL, adopted by the Call of Duty mainline, Call of Duty Mobile, and EMEA sales teams
  • Migrating a legacy ETL system to normalize Call of Duty data from various titles, resulting in a 50% reduction in data availability time
  • Migrating from self-managed Apache Airflow to Astronomer and leading the evaluation and adoption of Databricks, which improved data product delivery times
  • Migrating off of Qubole/Hive/Presto to Databricks and migrating legacy systems from Redshift to the Databricks Delta Lake to improve scalability
  • Driving the adoption of MLFlow for data science workflows within the Activision Data Science community
  • Extending Spark functionality through user-defined functions, integrating third-party data with Activision’s game data to drive profitability
  • Creating the reference architecture for third-party data ingestion using Cats/ZIO, Circe, and http4s on Kubernetes
  • Leading the implementation of GDPR, CCPA, and other privacy measures within the data lake
  • Collaborating with Activision game studios to transform game artifact data into formats queryable in SparkSQL, and retooling studio small-data tools to work within Activision’s big data frameworks
  • Improving Diversity, Equity, and Inclusion efforts by expanding the hiring pipeline of potential candidates, and mentoring employees in Airflow, Kubernetes, Scala, and Spark
  • Managing multi-million-dollar data product contracts and leading the team serving as Activision’s primary authority on Spark with Python, Scala, and SparkSQL

6/2014 – 4/2019

Lead Database Architect

Activision Publishing: Boulder, CO

As a lead data engineer within Activision’s Central Data Products organization, I performed analytic research, built models, and supported game designers. Through my efforts, I led the widespread adoption of Apache Airflow and Spark within the company.

Additional responsibilities and accomplishments include:

  • Providing analytics to optimize vehicles and weapons in Call of Duty: Black Ops IV
  • Collaborating with data scientists to build various models to improve gameplay performance
  • Improving the performance of the Play of the Match (POTM) simulation by 20%
  • Introducing the use of Apache Spark at Activision and training data scientists in its use
  • Creating Spark extensions in Scala for dealing with encoded data
  • Interviewing and recommending data engineers and data scientists
  • Designing and implementing a Cassandra cluster for storing personally identifiable information (PII) for Activision’s GDPR initiative
  • Developing a GDPR PII data store using a streaming Akka pipeline built with Scala, Kafka, and Cassandra
  • Leading the adoption of Airflow within Activision
  • Migrating Airflow from DCOS to Kubernetes
  • Managing multiple Hive, Presto, and Spark clusters within Qubole
  • Establishing best practices for using big data technology such as Presto and Apache Spark
  • Designing and implementing a data pipeline service to capture prelaunch and beta data for Call of Duty titles

1/2013 – 6/2014

Senior Consultant

FICO: Remote

As a team lead, I was responsible for enhancing credit and retail applications for the largest bank in the U.S. and implementing recommendation systems for major pharmaceutical companies using Python, Vertica, and Pentaho.

Additional responsibilities and accomplishments include:

  • Leading the team that saved FICO’s relationship with the nation’s largest bank
  • Successfully implementing a credit card fraud application that had been delayed by over a year by another team
  • Architecting and implementing ETL processes using Pentaho Data Integration
  • Converting Kia’s application for determining the best dealer and best service dealer for Kia’s customers from VB.NET and PostgreSQL to Java and Vertica
  • Mentoring developers on alternative ETL techniques using Python
  • Interviewing and recommending staff for hiring

10/2011 – 12/2012

Senior ETL Architect

Productive Data Solutions: Denver, CO

As an ETL consultant, I was responsible for providing guidance and implementing ETL systems for clients.

Additional responsibilities and accomplishments include:

  • Designing and implementing ETL processes using a combination of Pentaho Data Integration and Python
  • Developing a data mapping solution using Django, jQuery, and Oracle for migrating data between databases with differing source and target schemas
  • Recommending and implementing the migration of a client’s Pentaho repository to a file-based system under Subversion, reducing deployment time from over an hour to 30 seconds
  • Mentoring developers on alternative ETL techniques using Python
  • Implementing a HIPAA reporting system using Python
  • Mentoring QA staff on automation through Linux shell scripting
  • Improving sprint velocity by 50% by leading an effort to enhance user stories for ETL sprints by working with business analysts and clients
  • Interviewing SQL developers

3/2006 – 10/2011

Software Architect

Transzap: Denver, CO

As a software architect, I was responsible for improving the performance of software products for customers and internal data systems.

Additional responsibilities and accomplishments include:

  • Introducing Python as a way to perform data transformations and automate various tasks quickly
  • Developing multiple Python applications to verify the integrity of our system conversions and upgrades
  • Migrating Transzap’s legacy e-payables system running on Orion to the latest generation system on Tomcat
  • Introducing columnar database technology (Vertica) to offload reporting load from the transactional database
  • Converting SSAS cubes to Vertica, greatly simplifying data access by using SQL instead of MDX
  • Creating Java web services to perform analytical queries against Vertica and returning the results as XMLA
  • Implementing several ETL systems with Pentaho Data Integration
  • Designing and implementing systems that contributed to Transzap being recognized by the Deloitte Fast 500
  • Developing SQL Server Integration Services packages to migrate data into a data warehouse
  • Reducing the start-up time of Spendwork’s C# application from minutes to seconds

7/2000 – 3/2006

Application Architect

Calpine: Fort Collins, CO

As a senior applications architect, I was responsible for establishing standards and best practices for data warehousing, XML, web services, and service-oriented architecture (SOA) at Calpine. I also designed and implemented systems that allowed Calpine’s personnel to monitor the performance and efficiency of their power plant fleet.

Additional responsibilities and accomplishments include:

  • Directly contributing to Calpine’s 5th place ranking in the InformationWeek top 100 innovators (InformationWeek Sept. 19, 2005 issue)
  • Providing architectural oversight to numerous development projects
  • Interviewing and recommending potential candidates for software development positions
  • Budgeting projects and setting initial project management timelines, with projects ranging in size and scope up to $600K
  • Leading reviews on all database designs for new systems or enhancements to current systems
  • Evaluating business intelligence tools and gaining buy-in from all information services organizations within Calpine
  • Developing a data warehousing and OLAP application using ASP.NET, SQL Server, and SSAS for comparing meter data for natural gas and electric power sales
  • Designing and implementing an OLAP cube for power plant fleet reliability analysis
  • Developing and implementing a data warehouse to automate reporting from a Maximo inventory system using SQL Server, C#, and DTS
  • Designing, developing, and testing back-end components for real-time telemetry gathering from power plants located throughout the country
  • Creating a data mapping tool using C#, ADO.NET, Oracle, and OSI PI to gather information from various systems
  • Designing and developing the database that maintained power plant metadata for the Calpine fleet
  • Modifying C++/MFC based libraries to handle different contract periods for power plants
  • Designing, developing, and creating an OLAP server from scratch that cached different periods such as gas days, peaking periods, and off-peak periods
  • Playing a key role in Calpine’s early adoption of Microsoft’s .NET technologies by working directly with Microsoft on the C# language

12/1999 – 7/2000

Systems Analyst II

City of Thornton: Thornton, CO

As a systems analyst, I was responsible for instructing MIS staff in advanced programming languages and technologies, such as C++, COM, MTS, and ASP. Additionally, I was responsible for the implementation, maintenance, and upgrade of mission-critical systems for the City of Thornton.

Additional responsibilities and accomplishments include:

  • Interviewing and recommending potential candidates for software development positions
  • Advising on software purchases and the implementation of a development life cycle for internal projects
  • Providing technical support and maintenance for various systems used by the City of Thornton
  • Continuously improving system performance, reliability, and security through regular upgrades and updates

1/1999 – 12/1999

Information Technology Lead

VantagePoint Network: Fort Collins, CO

As a software developer, I played a key role in creating one of the first web-based agricultural platforms to assist crop professionals in increasing crop yields while reducing costs and environmental impact.

Additional responsibilities and accomplishments include:

  • Designing and building a system for collecting GPS-based yield card information from combines, using technologies such as C++, ATL COM, ADO, MTS, MSMQ, Oracle, and SDE
  • Designing and building a system for storing soil test information, utilizing C++ ATL COM objects with ADO, MTS, Oracle, and SDE
  • Implementing a COM object for migrating spatial data in an ESRI SDE Oracle database using C++, ATL COM, and the SDE API
  • Building an NT service with a COM interface for encrypting user ID and password information using C++, ATL, and Visual Basic
  • Implementing COM business objects written in C++ using ATL, ADO, MTS, and Oracle
  • Playing an instrumental role in designing and building a web-based crop record management system using ASP, ADO, and Oracle
  • Building installation programs for deployment between development, test, and production systems using Wise, VBScript, and MTS package exports
  • Assisting the QA department in creating guidelines for reporting bugs, testing, and correcting bugs
  • Collaborating with database designers on the design of the crop database system
  • Organizing and participating in code review sessions

2/1995 – 1/1999

Programmer/Analyst

State Farm Insurance Companies: Bloomington, IL

As a programmer/analyst, I was responsible for designing and implementing mission-critical business systems for various insurance products.

Additional responsibilities and accomplishments include:

  • Designing and building COM objects to integrate third-party insurance software packages with State Farm’s legacy systems
  • Designing an intranet-based application for online policy rating using technologies such as COM, COM TI, DB2, DHTML, and MTS
  • Maintaining coding standards for my area and acting as a mentor and resource for analysts in C/C++ and MFC
  • Teaching and mentoring C++ to employees during after-hours sessions
  • Designing and building a system to replicate marketing data for up to 5,000 locations using C++, DB2, and MQSeries
  • Debugging a third-party MFC C++ application for performing life insurance illustrations at the vendor’s site
  • Enhancing an expert system written in AionDS for pricing auto policies
  • Tuning and debugging COBOL applications
  • Working with business analysts to discover business rules for expert systems
  • Maintaining PL/1 applications that gathered information from various IMS databases as input to different expert systems

1/1996 – 12/1996

C++ Instructor

Heartland Community College: Bloomington, IL

As an instructor of C++ programming, I was responsible for teaching the fundamentals of the language and object-oriented programming to students.

Key concepts covered included:

  • Analysis and design principles
  • Class and object creation
  • Exception handling techniques
  • Inheritance and polymorphism
  • Overloaded operators and their applications
  • Best practices in C++ programming

5/1993 – 1/1995

Computer Operator

Rockwell Automation Allen-Bradley: Mequon, WI

As a computer operator, I assisted systems analysts and system programmers with mainframe, network, and PC operations and maintenance.

Additional responsibilities and accomplishments include:

  • Contributing to the successful completion of several ISO 9000 audits by maintaining documentation for mainframe operations procedures
  • Writing JCL and REXX scripts to run programs and backups
  • Monitoring and reporting on telecommunication usage and availability
  • Playing a key role in implementing automated scheduling for mainframe applications, improving efficiency and reliability


University of Wisconsin - Milwaukee
Bachelor of Business Administration (December 1994)
Major: Management Information Systems
Major GPA: 3.67
Overall GPA: 3.21
Dean’s List: Fall 1993, Spring 1994, and Fall 1994