Resume/CV

Summary

Strategically focused data-engineering manager with a track record of implementing large data systems and the technical depth to handle implementation details. Proven ability to lead multiple high-profile projects simultaneously. Adept at working across teams to turn users’ requirements into working systems.

Experience

2/2022 – Present

Chief Data Architect

Stellantis: Remote from Tampa Bay, FL

I lead international teams that deliver world-class data products for connected vehicles, helping create the future of mobility. I manage both data engineering and DevOps teams at Stellantis that increase the value and availability of connected vehicle data.

Other responsibilities and accomplishments:

  • Responsible for a budget in the tens of millions of dollars
  • Creating a robust data infrastructure for connected vehicle data
  • Leading big data design sessions
  • Mentoring employees on data engineering and functional programming

5/2019 – 2/2022

Senior Manager, ETL

Activision Publishing: Remote from Tampa Bay, FL

I led a diverse, globally distributed team of data engineers that provided data products to data science partners and analysts throughout Activision. I also worked with data scientists on feature engineering design and implementation.

Other responsibilities and accomplishments:

  • Responsible for all silver/gold level data for the entire Call of Duty franchise
  • Successfully led the ETL migration from Amazon Web Services (AWS) to Google Cloud Platform (GCP)
  • Led the effort to create a Spark-based ETL framework that made ETL self-service in many instances; this framework is used by Call of Duty mainline, Call of Duty Mobile, and the EMEA sales team
  • Led the effort that migrated a legacy ETL system our partners created to normalize Call of Duty data from various titles, which resulted in a 50% reduction in the time to data availability
  • Responsible for the migration from Apache Airflow to Astronomer-managed Airflow
  • Conducted the Databricks evaluation and adoption, which improved delivery times of data products
  • Led the effort to migrate off of Qubole/Hive/Presto to Databricks
  • Responsible for multi-million dollar contracts on data products
  • Led the team that served as Activision’s primary authority on using Spark with Python, Scala, and SparkSQL
  • Led the effort to migrate legacy systems from Redshift to the Databricks Delta Lake to improve scalability
  • Drove the adoption of MLFlow for data science workflows within the Activision Data Science community
  • Extended Spark functionality through user-defined functions (see the sketch after this list)
  • Integrated third-party data with Activision’s game data to drive profitability
  • Created the reference architecture for third-party data ingestion using Cats/ZIO, Circe, and http4s on Kubernetes
  • Led the implementation of GDPR, CCPA, and other privacy measures within the data lake
  • Worked with Activision game studios on transforming game artifact data into formats queryable in Spark SQL
  • Retooled small data tools from game studios to work within the Activision big data frameworks
  • Improved Activision’s Diversity, Equity, and Inclusion efforts by expanding the hiring pipeline of potential candidates
  • Mentored employees in Airflow, Kubernetes, Scala, and Spark
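
For illustration, a minimal sketch in Scala of the kind of Spark user-defined function work described above; the dataset, column names, and tiering logic are hypothetical and not taken from Activision’s pipelines:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, udf}

    object UdfSketch {
      def main(args: Array[String]): Unit = {
        // Local session for the sketch; production jobs would run on a cluster.
        val spark = SparkSession.builder()
          .appName("udf-sketch")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        // Hypothetical telemetry rows: a player id and a raw score.
        val matches = Seq(("player-1", 1250L), ("player-2", 980L))
          .toDF("player_id", "raw_score")

        // A user-defined function that buckets raw scores into tiers.
        val scoreTier = udf { raw: Long =>
          if (raw >= 1000L) "high" else "standard"
        }

        // Register the UDF so it can also be called from SparkSQL.
        spark.udf.register("score_tier", scoreTier)

        matches.withColumn("tier", scoreTier(col("raw_score"))).show()

        spark.stop()
      }
    }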

6/2014 – 4/2019

Lead Database Architect

Activision Publishing: Boulder, CO

Responsible for a wide variety of functions within Activision’s Central Data Products organization, Activision’s data science team. Recommending game instrumentation, performing analytic research and model building, and supporting game designers were essential parts of this role. I led the widespread adoption of Apache Airflow and Spark within Activision.

Other responsibilities and accomplishments:

  • Provided analytics on optimizing vehicles and weapons in Call of Duty – Black Ops IV
  • Worked with data scientists to build various models to improve gameplay performance
  • Improved the performance of the Play of The Match (PTOM) simulation by 20%
  • Introduced the use of Apache Spark at Activision and trained data scientists in its use
  • Created Spark extensions in Scala for dealing with encoded data (see the sketch after this list)
  • Interviewed and recommended data engineers and data scientists
  • Created a Cassandra cluster for storing Personally Identifiable Information (PII) for Activision’s GDPR initiative
  • Implemented a GDPR PII data store using an Akka Streams pipeline built with Scala, Kafka, and Cassandra
  • Led the adoption of Airflow within Activision
  • Migrated Airflow from DC/OS to Kubernetes
  • Managed multiple Hive, Presto, and Spark clusters within Qubole
  • Created best practices for using big data technology such as Presto and Apache Spark
  • Designed and implemented a data pipeline service to capture prelaunch and beta data for Call of Duty titles
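
A minimal sketch of what a Spark extension for encoded data might look like; the base64 encoding and column names are assumptions for illustration, not the actual Activision encoding:

    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.functions.{col, unbase64}

    // Syntax extension: any DataFrame gains a helper that decodes a
    // base64-encoded column into a readable string column.
    object EncodedDataSyntax {
      implicit class EncodedDataFrameOps(df: DataFrame) {
        def withDecoded(encodedCol: String, decodedCol: String): DataFrame =
          df.withColumn(decodedCol, unbase64(col(encodedCol)).cast("string"))
      }
    }

    // Usage, assuming a DataFrame `events` with a base64-encoded "payload" column:
    //   import EncodedDataSyntax._
    //   val decoded = events.withDecoded("payload", "payload_text")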

1/2013 – 6/2014

Senior Consultant

FICO: Remote

Responsible for leading a team that enhanced credit and retail applications for the largest bank in the U.S. I also implemented recommendation systems for major pharmaceutical companies leveraging Python, Vertica, and Pentaho.

Other responsibilities and accomplishments:

  • Led the team that saved FICO’s relationship with the nation’s largest bank
  • Successfully implemented a credit card fraud application that another team had delayed by over a year
  • Architected and implemented ETL processes using Pentaho Data Integration
  • Converted Kia’s application for selecting the best sales dealer and best service dealer for Kia’s customers from VB.Net and PostgreSQL to Java and Vertica
  • Mentored developers on other ETL techniques utilizing Python

10/2011 – 12/2012

Senior ETL Architect

Productive Data Solutions: Denver, CO

Responsible for providing guidance and implementing ETL systems for clients.

Other responsibilities and accomplishments:

  • Designed and implemented ETL processes using a combination of Pentaho Data Integration and Python
  • Created a data mapping solution in Django, jQuery, and Oracle for migrating to databases where the source and target schemas are different
  • Recommended and moved a client’s Pentaho repository to a file-based system utilizing Subversion, reducing deployment time from over an hour to 30 seconds
  • Mentored developers on other ETL techniques utilizing Python
  • Implemented a HIPAA reporting system using Python
  • Mentored QA staff on automation through Linux shell scripting
  • Improved sprint velocity by 50% by leading an effort to enhance user stories for ETL sprints by working with business analysts and clients
  • Responsible for interviewing SQL developers


3/2006 – 10/2011

Software Architect

Transzap: Denver, CO

Responsible for improving the performance of software products for customers and internal data systems.

Other responsibilities and accomplishments:

  • Interviewed and recommended potential candidates for software development positions
  • Introduced Python as a way to quickly do data transformations and to automate different tasks
  • Wrote multiple Python applications to verify the integrity of our system conversions and upgrades
  • Migrated Transzap’s legacy e-payables system running on Orion to the latest generation system on Tomcat
  • Introduced columnar database technology (Vertica) to offload reporting load from our transactional database
  • Converted SSAS cubes to Vertica, greatly simplifying access to the data using SQL instead of MDX
  • Created Java web services to perform analytical queries against Vertica and return the results as XMLA
  • Implemented several ETL systems with Pentaho Data Integration
  • Designed and implemented systems that contributed to Transzap being recognized by the Deloitte Fast 500
  • Wrote SQL Server Integration Services to migrate data into a data warehouse
  • Reduced start-up time of Spendwork’s C# application from minutes to seconds

7/2000 – 3/2006

Application Architect

Calpine: Fort Collins, CO

Responsible for developing standards for data warehousing, XML, web services, and service-oriented architecture (SOA). Also responsible for designing systems that allowed Calpine’s personnel to monitor their power plant fleet for economic efficiency.

Other responsibilities and accomplishments:

  • Directly contributed to Calpine’s 5th-place ranking in the InformationWeek top 100 innovators (InformationWeek, Sept. 19, 2005 issue)
  • Provided architectural oversight to numerous development projects
  • Budgeted projects and set initial project management timelines; projects varied in size and scope, ranging up to $600K
  • Led reviews on all database designs for new systems or enhancements to current systems
  • Evaluated business intelligence tools and obtained buy-in from all information services organizations within Calpine
  • Created a data warehousing and OLAP application using ASP.NET, SQL Server, and SSAS for comparing meter data for natural gas and electric power sales
  • Implemented an OLAP cube for power plant fleet reliability analysis
  • Created a data warehouse to automate reporting from a Maximo inventory system using SQL Server, Windows Services written in C#, and DTS
  • Designed, developed, and tested back-end components that gathered real-time telemetry from power plants located throughout the country
  • Created a tool that mapped plant information data from various systems using C#, ADO.NET, Oracle, and OSI PI
  • Designed the database that maintained power plant metadata for the Calpine fleet
  • Modified C++/MFC based libraries to deal with different contract periods a plant may encounter
  • Designed and built from scratch an OLAP server that cached different period types such as gas days, peaking periods, and off-peak periods
  • Played a key role in Calpine’s early adoption of Microsoft’s .NET technologies, working directly with Microsoft on the C# language

12/1999 – 7/2000

Systems Analyst II

City of Thornton: Thornton, CO

Responsible for training MIS staff in C++, COM, MTS, and ASP development. Also, implemented, maintained, and upgraded mission-critical systems.

Other responsibilities:

  • Interviewed new development staff
  • Recommended software purchases
  • Researched and implemented a development life cycle for internal projects
  • Maintained and upgraded various systems used by the City of Thornton

1/1999 – 12/1999

Information Technology Lead

VantagePoint Network: Fort Collins, CO

Led the creation of one of the first web-based agricultural platforms to assist crop professionals. The platform allowed crop professionals to work with producers on strategies to increase crop yields while reducing costs and environmental impact.

Other responsibilities:

  • Designed and built a system for taking GPS-based yield card information from combines using C++, ATL COM, ADO, MTS, MSMQ, Oracle, and SDE
  • Designed and built a system to store soil test information using C++ ATL COM objects with ADO, MTS, Oracle, and SDE
  • Implemented a COM object for migrating spatial data in an ESRI SDE Oracle database using C++, ATL COM, and the SDE API
  • Built an NT Service with a COM interface for encrypting user id and password information using C++, ATL, and Visual Basic
  • Implemented COM business objects written in C++ using ATL, ADO, MTS, and Oracle
  • Instrumental in designing and building a web-based crop record management system using ASP, ADO, and Oracle
  • Built installation programs for deployment between our development, test, and production systems using Wise, VBScript, and MTS package exports
  • Assisted the QA department in developing guidelines for reporting, testing, and correcting bugs
  • Collaborated with database designers on the design of the crop database system
  • Organized and participated in code review sessions
  • Interviewed and recommended potential candidates for information technology positions

2/1995 – 1/1999

Programmer/Analyst

State Farm Insurance Companies: Bloomington, IL

Responsible for designing and implementing mission-critical business systems across a variety of insurance products.

Other responsibilities:

  • Designed and built COM objects to integrate third-party insurance software packages with State Farm’s legacy systems
  • Designed an intranet-based application for online policy rating using COM, COM TI, DB2, DHTML, and MTS
  • Maintained C/C++ coding standards for my area and acted as mentor and resource to analysts in C/C++ and MFC
  • Taught and mentored employees in C++ after hours
  • Designed and built a system to replicate marketing data for up to 5,000 locations using C++, DB2, and MQSeries
  • Debugged a third-party MFC C++ application for doing life insurance illustrations at the vendor’s site
  • Enhanced an expert system written in AionDS for pricing auto policies
  • Tuned and debugged COBOL applications
  • Worked with business analysts to discover business rules for expert systems
  • Maintained PL/1 applications that gathered information from various IMS databases as input to different expert systems

1/1996 – 12/1996

C++ Instructor

Heartland Community College: Bloomington, IL

Taught a C++ course that covered the fundamentals of the language and object-oriented programming. Some of the key concepts covered were:

  • Analysis and design
  • Classes
  • Exception handling
  • Inheritance
  • Overloaded operators
  • Polymorphism

5/1993 – 1/1995

Computer Operator

Rockwell Automation Allen-Bradley: Mequon, WI

Responsible for assisting systems analysts and system programmers with mainframe, network, and PC problems.

Other responsibilities:

  • Participated in several ISO 9000 audits and maintained documentation for procedures on running the mainframe
  • Wrote JCL and REXX to run programs and backups
  • Maintained the telecommunication reports on usage and availability
  • Assisted in implementing automatic scheduling of applications on the mainframes

Education

University of Wisconsin - Milwaukee
Bachelor of Business Administration (December 1994)
Major: Management Information Systems