Resume

  • 303-588-3412
  • Accomplished data & database architect, data modeler and database developer with extensive experience with agile development of data warehouses, data marts, and operational data stores informed by both Kimball and Inmon methodologies. Experienced designing and implementing large databases while leveraging database tools for design, metadata, ETL, and code control to rapidly deploy fully functional systems. Assembles and leads teams by organizing workload and mentoring junior developers as required.
  • Able to view situations from any or all of the following perspectives: business, architectural, macroscopic, and microscopic. Possesses excellent analytical and communication skills, with an ability to organize information for, and relate to, all levels of management and systems personnel.
  • Team member on over 15 data marts/data warehouses as Tech Lead, Architect/Designer, Data Modeler, and/or ETL Developer. Several were for global corporations incorporating data from multiple ERP systems in multiple languages and character sets. Recent multi-terabyte experience.
  • Former Naval Officer and Navy Nuclear Power Operator
Education
University of Colorado
Boulder, Colorado
Bachelor of Science, Electrical Engineering and Computer Science
Practice areas
Roles
  • Data Warehouse Architect
  • Data Architect
  • ETL Architect
  • Team Leader
  • Data Modeler
  • Project Manager
  • Data Profiler
  • PL/SQL Developer
Tools and Methods

ER/Studio
PowerDesigner
ERwin
Oracle versions 6 through 11g R2 (RAC)
MS SQL Server 2005/2008
Sybase 12.5
DataStage/Informatica
Oracle Warehouse Builder
UNIX/Korn shell
Toad
MS Project
TFS
Subversion
PVCS

Experience
Senior Data Architect/Solutions Architect/Tech Lead
  • Senior Data Architect/Solutions Architect/Technical Lead for TCS, on site at a global financial company. Support an onshore/offshore team of developers and production support staff enhancing and maintaining the client’s consumer marketing database.
  • Analyze existing system architecture; document and socialize recommended architecture upgrades
  • Refactor database and ETL for performance and maintainability
  • Participate in requirements review for Hadoop proof of concept, review results of Hadoop POC.
  • Set up a Linux VM to support GPG/PGP encryption and source code control
  • Unix to Linux conversion
  • Oracle Upgrade
  • Datacenter Migration
  • Monitor existing operations to identify and document best and worst practices for ETL, database, and Unix coding, and train the team accordingly.
  • Interview team and review existing documentation to understand the nature and scope of the system. Create context and ER diagrams to use as a tool to understand the business and verify that understanding with members of the business and technical team.
  • SCRUM advocate; socialize the advantages of moving from waterfall to SCRUM with the team and all levels of TCS and client management. Participate in generating the process note and presenting the proposal. Participate in training fellow team members in SCRUM methodology and principles.
  • Deep dive into the address cleansing process and software (Experian Data Quality tools) during the migration from Windows Server 2003 to Windows Server 2012. The system cleanses addresses from every country in the world except one.
  • Implement and document configuration management environment for all database code and Unix and Windows ETL scripts.
  • Implement OneNote repository on SharePoint for capturing and distributing technical and process knowledge, meeting minutes and day to day status capture.
  • Set up test environments and synch up schemas using RedGate Tools for Oracle.
  • Design process for using RedGate tools with Subversion and the team’s branching methodology.
  • Design and lead team to implement generic file load process for ETL, creating and testing a Korn shell library of functions for ETL processes that could be used in multiple environments.
  • Implement multiple PL/SQL packages in support of ETL processing
    • Process logging called by Unix shell scripts and PL/SQL programs
    • A variety of standard ETL utilities, including a gather-stats wrapper and partition management automation
  • Support DBA and ETL developers with performance tuning and database issue troubleshooting.
  • Troubleshoot and support repair of AIX and Linux connectivity issues.
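The generic file-load bullet above can be illustrated with a small sketch of such a Korn shell function library. The function names (`log_msg`, `check_file`) and the sample feed file are illustrative assumptions, not the actual library's API:

```shell
# Minimal sketch of a reusable ETL shell function library of the kind
# described above; names and layout are illustrative assumptions.

log_msg() {
    # Timestamped process logging: level, then message
    printf '%s [%s] %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$1" "$2"
}

check_file() {
    # Verify an inbound feed file exists and is non-empty before loading
    if [ -s "$1" ]; then
        log_msg INFO "file OK: $1"
    else
        log_msg ERROR "missing or empty file: $1"
        return 1
    fi
}

# Example usage by a load script that would source this library:
echo 'id|name' > /tmp/feed.dat
check_file /tmp/feed.dat
```

A real library would add lock files, archiving, and error notification; the point is that all load scripts in every environment share one tested set of primitives.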
Kaiser Permanente/Datasource Consulting
Denver, CO
Senior Data Architect/Agile Team Member
  • Senior Data Architect on site for medical company building an External Encounter (Claims) Data Mart:
    • Interview team and review existing documentation to understand the nature and scope of the project.
    • Create context diagrams to use as a tool to understand the business and verify that understanding with members of the team.
    • Explore and catalog data sources. Data profile relevant data sources.
    • Analyze results of data profiles and design star schema data mart.
    • Create and maintain dimensional data models as well as source models in ERWin.
    • Participate in cross team data model reviews and knowledge transfer
  • SCRUM team member and advocate of Agile methods and SCRUM techniques
  • Implement and document configuration management environment for all database code
    • Set up development and build Oracle databases on a VM server. Organize all database and static data scripts into a set of build scripts to be executed or backed out with a single command.
    • Configure Jenkins Continuous Integration server to automatically exercise these scripts anytime a change is checked into Subversion and report via email if changes break the “build”
    • Support Agile BI development with rapid modifications to database design as new requirements surface or existing requirements are better understood.
    • Train fellow team members to create zero-defect database deployments
  • Implement multiple PL/SQL packages in support of Informatica ETL processing
    • Process logging called by both Informatica and other PL/SQL programs
    • Gather stats wrapper called by Informatica
    • Implement data driven table partitioning strategy
    • Data compression and decompression procedures
    • Threading support for above features on large jobs using Oracle Scheduler
  • Support DBA and ETL developers with performance tuning and database issue troubleshooting.
  • Design and support implementation of test automation to support agility
  • Participate in cross-functional team knowledge transfer sessions
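The single-command build-and-backout approach described above can be sketched roughly as follows; the directory layout, script numbering, and `build_db` name are assumptions, and the real driver would run each script through SQL*Plus rather than echoing it:

```shell
# Sketch of a single-command database build driver: apply numbered
# DDL/static-data scripts in lexical order, stopping on first failure.
# Layout and names are illustrative assumptions.

build_db() {
    dir=$1
    for f in "$dir"/[0-9]*.sql; do
        [ -e "$f" ] || { echo "no scripts found in $dir"; return 1; }
        echo "applying $f"
        # Real build step (assumption): sqlplus -S "$CONN" @"$f" || return 1
    done
}

mkdir -p /tmp/ddl
printf 'create table t1 (x integer);\n' > /tmp/ddl/010_tables.sql
printf 'create index i1 on t1 (x);\n'   > /tmp/ddl/020_indexes.sql
build_db /tmp/ddl
```

Jenkins can then call this one entry point on every Subversion check-in and fail the build on a non-zero exit code.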
Comcast
Denver, CO
Senior Data and Database Engineer
  • Reporting to the principal architect, provided data architecture and software engineering support for the Converge Event Management Platform (CEMP) group.
  • Reverse engineer data models with PowerDesigner and drive their use.
  • Implement star schema reporting and ETL for the auditing portion of Real Time Entitlements project.
  • Provide project architecture support for various projects.
  • Support continuous integration/automated build project.
    • Place DDL under source code control in CVS
    • Implement full and incremental database builds to support continuous integration
    • Install and configure Jenkins continuous integration server, with Maven and Java
    • Install and use Git for source code control.
Experis Corporation
Kalamazoo, MI
Lead Technical Manager
  • Technical lead for team of 5 data warehouse consultants.
  • Reverse engineer data models and drive their use.
  • Design, document and implement optimized environment for team productivity.
    • Implement source code control & change management using Subversion and RallyDev
    • Software Configuration Management practices and procedures
    • Implement full and incremental database builds to support continuous integration
    • Quality Assurance practices and procedures
    • Oracle 11.2 Data warehouse best practices
    • Oracle Enterprise Manager
    • Pentaho best practices
    • Architect for overall performance of Oracle and ETL
    • Set up the Oracle Hierarchical Profiler and use it to tune existing PL/SQL packages.
Southwest Florida Water Management District
Tampa, FL
Senior Data Warehouse Consultant
  • Assumed responsibility for a mission critical Operational Data Store (ODS) for the Water Information Management System (WMIS). Reverse engineered source code for all database objects and programs and placed them under source code control in MS Team Foundation Server. Implemented scripted database deployment/build process for the ODS. Implemented branching and merging for the ODS code. Rewrote the ETL process to include enhanced process and error logging while optimizing performance and reliability. Implemented an incremental update strategy using code generated Oracle MERGE commands. Documented the process and organization of data and code for follow on developers.
  • Redesigned the ODS; when implemented, the new design will reduce the number of objects and lines of code by approximately 85% and provide near-real-time updates in the ODS.
  • Engaged by management to design and implement a “New Project Template” as a framework around which we standardized the organization of software development deliverables so that it supported good configuration management practices and scripted database deployments. The template allowed various subject matter experts to contribute, so each new project thereafter incorporated the added functionality or knowledge.
  • Reverse engineered database source code for the WMIS system and organized it in source code control.
  • Designed Star Schema for all water measurements.
  • Designed and implemented a data mart for WMIS system usage versus trouble reports metrics.
  • Supported production database deployments and operations.
  • Reverse engineered and organized data models for various projects using Embarcadero’s ER Studio Data Architect.
  • As DBA and System admin for build environment, developed and documented process and code to quickly and easily create a new Oracle or MS SQL Server database environment.
  • Resolved every database performance issue presented in the Oracle RAC environment:
    • Reduced an ETL refresh process that was running 6-10 hours and frequently causing production outages to a reliable process of under one hour within the first week on site.
    • Reduced to millisecond response time an update trigger that had been punishing district users for 5 years with a 3-5 minute wait every time they tried to edit a PARTY record. Did it in less than 8 hours of billable time.
    • Tuned two materialized views taking 2-4 hours of runtime each refresh to 5 minutes in less than 8 hours of billable time.
  • Support and mentor other developers in good programming practices and procedures
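The code-generated Oracle MERGE technique for incremental updates mentioned above can be sketched as a small generator; the `stg_` staging prefix and the example table and columns are hypothetical:

```shell
# Hedged sketch of generating an incremental-update Oracle MERGE from a
# table name, key column, and non-key columns. The stg_ staging prefix
# and the example columns are assumptions for illustration.

gen_merge() {
    tbl=$1; key=$2; shift 2
    set_list=""; cols="$key"; vals="s.$key"
    for c in "$@"; do
        set_list="${set_list:+$set_list, }t.$c = s.$c"
        cols="$cols, $c"
        vals="$vals, s.$c"
    done
    cat <<EOF
MERGE INTO $tbl t
USING stg_$tbl s
   ON (t.$key = s.$key)
 WHEN MATCHED THEN UPDATE SET $set_list
 WHEN NOT MATCHED THEN INSERT ($cols) VALUES ($vals);
EOF
}

gen_merge party party_id name status
```

Generating the MERGE from metadata keeps many near-identical upsert statements consistent and cheap to regenerate when a table changes.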
Sedgwick CMS
Memphis, TN
Data Architect/Data Modeler
  • Designer and Oracle Lead for the Integrated Reporting and Interfaces Datamart (IRIS)
  • Led a team of 4 developers and created an Oracle 10g Datamart sourced from multiple source systems, including a VLDB (3TB+)
  • Design and implement Star Schema
  • Responsible for performance tuning of ETL programs getting data from VLDB.
  • Tune the loading of the star schema, including parallel query, bitmap indexes, bitmap join indexes, explain plans, and access paths
  • Responsible for partitioning of Datamart tables
  • Participate in partitioning of VLDB source system
  • Design, write and deploy a global process logging application
  • Participate in the design and implementation of an insurance web services server for the Aon iVOS web service client. The web service client is based on the ACORD standard and uses XML, WSDL, and XML schemas
  • Design an iVOS ODS schema to support multiple iVOS instances in one ODS
  • Data profile the iVOS and viaOne data using Datirus data profiling software
  • Reverse Engineer and document data models for key viaOne and iVOS tables
  • Participate in requirements gathering and reviews, wrote design documentation and participated in design and code reviews
  • Project estimation and project plan generation
  • Generate and maintain data models in ER/Studio 8.5 using extensive SAX Basic automation
  • Code generate over 180 PL/SQL stored packages that interface to .NET.
  • Code generate ETL to move MS SQL Server data to Oracle using BCP and Oracle SQL*Loader direct path mode.
  • Write a C# program using a SQL parser to analyze 388 Jasper XML report specs
  • Generate and maintain DDL with associated change management and version control
  • Support Sedgwick to mature their software configuration management processes and procedures. Trained developers in the use of Subversion.
  • Design and implement a reliable, repeatable build and deploy process with Subversion, SQL*Plus, and shell scripts
  • Support and mentor other developers in good programming practices and procedures
  • Oracle 10G/11G PL/SQL development interfacing to .NET
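The BCP-to-SQL*Loader hand-off above can be sketched as a control-file generator; the table and column names are hypothetical examples:

```shell
# Minimal sketch of the code-generated hand-off from a pipe-delimited
# BCP extract to a SQL*Loader control file. Table/columns are
# hypothetical examples.

gen_ctl() {
    tbl=$1; shift
    cat <<EOF
LOAD DATA
INFILE '$tbl.dat'
APPEND
INTO TABLE $tbl
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
($(echo "$@" | tr ' ' ','))
EOF
}

gen_ctl claims claim_id claim_dt amount > /tmp/claims.ctl
cat /tmp/claims.ctl
# A direct path load would then be invoked roughly as (assumption):
#   sqlldr userid=$CONN control=/tmp/claims.ctl direct=true
```

Direct path bypasses conventional SQL INSERT processing, which is why it pairs well with bulk BCP extracts.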
Stanley Works
New Britain, CT
Data Warehouse Architect/Technical Lead
  • Tech Lead/Data Architect for global AR BI Datamart (SAP, Navision, JD Edwards, and ACG)
  • Generate and maintain data models for 5 new projects
  • Implement Metadata capture, storage and deployment through Business Objects
  • Responsible for all data model (PowerDesigner) and DDL changes for the Stanley Data Warehouse
  • Generate and maintained DDL with associated change management
  • Implement source code control and change management (Subversion, MantisBT)
  • Create and implement release/build management system
  • Implement Subversion, Mantis, and DokuWiki servers with VMware JumpBox virtual appliances
  • Upgrade PowerDesigner and repository (10.2 to 12.5)
Citigroup International
New York City, NY
Senior Data Warehouse Consultant
  • Re-architect PeopleSoft Global Data Warehouse ETL process
  • Design and build an ELT process for PeopleSoft Global Data Warehouse, prototype was estimated to be 100 times faster than production system
  • Reverse engineer physical and logical data models for PeopleSoft Global Data Warehouse
  • Troubleshoot, document and resolve performance bottlenecks
  • Participate on design reviews for projects interfacing with Global Data Warehouse
  • Research and implement adjacency list to nested set conversion which resulted in 2,400% performance improvement over PeopleSoft implementation
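For context on the nested-set conversion above: a depth-first walk assigns each node a left number on entry and a right number on exit, so querying a subtree becomes a simple range predicate (`lft BETWEEN parent.lft AND parent.rgt`) instead of a recursive traversal, which is where the large speedup over the adjacency-list implementation comes from. A toy sketch in awk (the actual conversion ran against the PeopleSoft tree tables, not this sample input):

```shell
# Toy sketch of adjacency-list to nested-set conversion: a depth-first
# walk assigns lft on entry and rgt on exit, so every descendant
# satisfies parent.lft < child.lft < child.rgt < parent.rgt.
# The sample "child parent" tree is illustrative only.

cat > /tmp/tree.txt <<'EOF'
world -
europe world
asia world
france europe
EOF

result=$(awk '
$2 == "-" { root = $1; next }
          { kids[$2] = kids[$2] " " $1 }
function visit(n,   i, m, c) {
    lft[n] = ++ctr
    m = split(kids[n], c, " ")
    for (i = 1; i <= m; i++) visit(c[i])
    rgt[n] = ++ctr
}
END {
    visit(root)
    for (n in lft) print n, lft[n], rgt[n]
}' /tmp/tree.txt)
echo "$result"
```

In the real system, one SQL range scan over (lft, rgt) replaces repeated self-joins on the parent column.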
Systems Architect/Lead Developer
  • Re-architect RITS system (Personnel ODS) ETL-ELT Process, implement new design
  • Work with SMEs to understand current and future business requirements
  • Perform detailed source system analysis including analysis of data quality issues
  • Create “as is” data models and design future state requirements document with data models
  • Design, code, test and document ELT process for high performance and extensibility
  • Design and document process to rapidly resolve existing data quality problems
Oracle Database Architect/Developer
  • Architect Oracle portion of data delivery system for Global Compensation System interfacing to PeopleSoft HR system
  • Design and implement metadata/metrics capture system used on all data delivery, reports, and outbound feeds
  • Design and develop over 100 stored procedures
  • Create and maintain system data models
  • Mentor team of 15 developers on best practices, performance tuning and debugging