SAS® Consulting & Development

Providing Custom SAS Solutions For Over 30 Years

Updated November 19, 2019

For information regarding available services, please contact us

For a resume of project experience, download the MS Word version

Offering - Experience - Operating Environments - Copyright & Trademarks


Offering

  • BS in Computer Information Technology.
  • Senior Level Analyst/Developer with over thirty years of experience with the SAS System. Previous Security Clearance. More than 25 years of experience with SAS on the Unix and Windows platforms and 5+ years on Mainframe.
  • Advanced experience with the SAS System: Base procedures (FREQ, TABULATE, MEANS, UNIVARIATE, RANK, EXPAND), index processing, SAS/GRAPH, Perl regular expressions, macros, Data Step, DDE, SQL, UPLOAD/DOWNLOAD, IMPORT/EXPORT, access to Oracle, Teradata, Netezza, DB2, SQL Server, and PC files, the Output Delivery System (ODS), SCL, SAS/IntrNet, customer relationship management, campaign management, data validation, data cleansing, data mining, extracting internal/external data, formats and informats (including cleaning and validation), reporting, and web development with SAS and custom programs generating HTML.
  • Experienced at capturing and analyzing system metrics in SAS in real time. Wrote programs for the following: loading file and directory information, loading user access timing, loading and analyzing web logs, and loading and analyzing SAS logs, including the capture of timing output. Also wrote several UNIX-specific programs using 'du', 'df', 'last', 'who', 'finger', and 'ps'.
  • Working experience with FTP, UNIX scripting and process automation, Perl, CuteFTP, TextPad, WinZip, CSE HTML Validator, web browsers, Excel, Word, Access, JavaScript, VBScript, and Visual Basic.
  • Using custom ETL practices, we will bring together your in-house and third-party data sources into a usable, manageable data warehouse
  • Exchange data with Excel spreadsheets to provide real time updating for pivot tables, charts, and graphs
  • Clean and build data in support of model development
  • Create applications that build modeling sensitive variables
  • Provide a high level look at a data source by providing summary and unique values reporting for all variables
  • Develop programs using macro processing capabilities that allow for modifying execution logic
  • Create reports using ODS for presentation quality
  • Integrate email with reporting for web-deployable display
  • Develop applications to your specifications using the full complement of the SAS System procedures and data step processing capabilities
  • Develop applications that access and update your third party databases
  • Mentoring for your developing staff
  • Build real time and static web pages for your reporting needs
  • Audit and update your existing code with an eye towards efficiency
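The UNIX metric capture mentioned above relies on piped input. A minimal sketch, assuming a 'df -k' output layout; the fileref, dataset, and column names are illustrative:

```sas
/* Read 'df -k' output through a pipe; the column layout is an
   assumption and may need adjusting for a given UNIX flavor. */
filename dfcmd pipe 'df -k';

data disk_usage;
   infile dfcmd firstobs=2 truncover;   /* skip the header line */
   input filesystem :$40. kbytes used avail capacity :$8. mount :$60.;
run;

proc print data=disk_usage; run;
```

The same pattern applies to 'du', 'last', 'who', and 'ps' by swapping the piped command and the INPUT statement.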

Experience


Project Accomplishments

Wells Fargo
(San Antonio, TX)
July 2019 – Present

Member of a Customer Remediation team tasked with isolating the correct customers for the Remediation Execution team, which follows through with the appropriate customer contact. Used both SAS 9.4 on a laptop and Enterprise Guide 7.1 running on a SAS Grid server. Data sources range across Teradata, Oracle, MS SQL Server, flat files, and Excel spreadsheets. Data cleansing and validation are the initial tasks performed after data sources are identified and loaded into a workspace. Counts and summaries are then created on variables of interest. The remediation plan is then followed to create datasets used in building a final list of customers to be remediated. This final list is created as an Excel spreadsheet.

Puget Sound Energy
(Bothell, WA)
October 2018 – November 2018

Tasked with managing and performing a move of all files including SAS programs and datasets from an older AIX server to a newer AIX server. The only real difference between the two servers was the version of the UNIX operating system. I installed the SAS 9.2 software that also contained CONNECT. Other team members used OS utilities to copy and restore the files. I made datasets of the directory output from each server and used PROC COMPARE to validate that the files were the same on the new server as they were on the old server.
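The PROC COMPARE validation step can be sketched as follows, assuming one dataset of file names and sizes captured per server (all dataset and variable names here are illustrative):

```sas
/* Compare directory listings captured from the two AIX servers.
   OLD_SERVER and NEW_SERVER are assumed datasets with FILENAME
   and FILESIZE variables built from the directory output. */
proc sort data=old_server; by filename; run;
proc sort data=new_server; by filename; run;

proc compare base=old_server compare=new_server
             out=diffs outnoequal noprint;
   id filename;
   var filesize;
run;

/* An empty DIFFS dataset means every file matched. */
```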

BMO-Harris Bank
(Toronto, ON/Irving, TX)
Mostly Remote

May 2017 – October 2017

Tasked with automating three quarterly production streams (LGD, LGD_EAD, and BBC) into separate Enterprise Guide projects. Production run streams have from six to sixteen steps consisting of data steps, Access transpose/pivot, SQL, and various statistical procs. The objective was to have each project run from Enterprise Guide with an initial input panel to create or select the needed variable values for execution. This resulted in the creation of the initial panel with drop-down selection values and browsing tabs to navigate to the required directories for libname allocation. The values from the drop-down panels are then put into SAS macro variables for use throughout the application.

Mid-Atlantic Permanente
(Rockville, MD)
80% Remote

June 2016 - December 2016

Automated, modified, and migrated existing SAS V9.2 programs to SAS V9.4 so that they can be initiated by a single job that controls error checking and tracks the timing of each step. Existing programs used ODBC to connect to Teradata databases; modifications were made to connect directly to Teradata. Created Enterprise Guide projects for applications that were previously scattered over several directories. Created macro variables for program directory paths to make maintenance easier. Created charts and graphs of the summarized results of running the daily updating programs. Reviewed existing programs and made changes to improve efficiency. Developed Teradata lookup tables to be used instead of hardcoded values in the programs. Developed a macro-driven update process that allows any table and any variables to be updated with a choice of a new date, using output from the today() function as the starting point.
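A macro-driven update of that kind might look like the following hypothetical sketch; the macro name, table, variable, and offset parameters are all assumptions for illustration:

```sas
/* Hypothetical macro: update any table's date variable, using
   today() as the starting point plus an optional day offset. */
%macro set_date(table=, var=, offset=0);
   proc sql;
      update &table
         set &var = today() + &offset;
   quit;
%mend set_date;

/* Example call (names are illustrative) */
%set_date(table=work.schedule, var=run_date, offset=1)
```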

Bank Of America
(Boston, MA)
100% Remote

June 2015 - April 2016

Worked with other team members to develop, update, and enhance mandatory Government Compliance Reporting processes. Used Enterprise Guide on Linux based SAS Grid with SAS and SQL accessing DB2. Developed automated processes that produce exception reports that are passed to a testing team to validate the results.

JPMorgan Chase
(Columbus, OH)
25% Remote

June 2014 - December 2014

As a member of the Risk Management group, I pulled raw data from the data warehouse and created modeling-ready datasets. Reviewed existing code and made improvements in run time. Indexes were created in several databases, resulting in sub-second responses for queries that used to take from several minutes to a few hours. Reviewed SAS pass-thru processes to ensure that the target DBMS server was utilized to its fullest; in some cases execution times were cut in half. Utilized hash objects to reduce the run time of some business-related processing.
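The hash-object technique replaces a sort/merge with an in-memory lookup. A minimal sketch, with assumed dataset and variable names:

```sas
/* Load a small reference table into a hash object once, then look
   up each transaction row by key. WORK.RATES (loan_type, rate) and
   WORK.TRANSACTIONS are assumptions for illustration. */
data scored;
   length rate 8;
   if _n_ = 1 then do;
      declare hash h(dataset: 'work.rates');
      h.defineKey('loan_type');
      h.defineData('rate');
      h.defineDone();
      call missing(rate);
   end;
   set work.transactions;
   if h.find() ne 0 then rate = .;   /* no match: set missing */
run;
```

Because the lookup table never has to be sorted or re-read, this can cut elapsed time sharply on large transaction files.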

JPMorgan Chase
(Garden City, NY)
25% Remote

May 2013 - December 2013

Member of an award-winning team that planned, developed, and implemented automated applications to replace existing manual processes that created monthly, quarterly, and annual reports consisting of multi-tabbed Excel spreadsheets. The source data consisted of production spreadsheets and SAS datasets with actual and updated auto/student loan modeling and forecasting data created by various members of the team. The custom ExcelXP tagset was used to assist in creating the multi-tabbed results. PROC REPORT was used to control cell formatting and traffic-lighting based on variable content. Also a member of a team developing reports for CCAR submission. Modified PC code and uploaded it to allow processing to take place on a UNIX server. Developed an automated process to allow multi-tabbed Excel spreadsheets to be uploaded to the UNIX server. Using Enterprise Guide, I developed three applications that prompt users for required processing values and then execute the application.
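The ExcelXP/PROC REPORT combination can be sketched as below; the thresholds, colors, and dataset names are illustrative assumptions, not the production values:

```sas
/* Traffic-light a ratio column and write one worksheet per
   BY group via the ExcelXP tagset. */
proc format;
   value stoplite low-<0.95 = 'red'
                  0.95-<1.0 = 'yellow'
                  1.0-high  = 'lightgreen';
run;

proc sort data=work.forecast; by portfolio; run;

ods tagsets.excelxp file='forecast.xml'
    options(sheet_interval='bygroup');

proc report data=work.forecast nowd;
   by portfolio;
   column month actual forecast ratio;
   define ratio / display
          style(column)={background=stoplite.};
run;

ods tagsets.excelxp close;
```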

(San Antonio, TX)
August 2012 - December 2012

I updated analytical programs in Enterprise Guide with requested user changes. Using Enterprise Guide I developed an application that prompts users to allow them to create custom views. I modified Consumer Credit, Consumer Loan, and Home Equity applications for business rule changes, third party data content and format changes.

Blue Cross Blue Shield
(Boston, MA)
25% Remote

December 2011 - August 2012

Debugged and corrected existing programs from a previous vendor. Modified programs with recent business rule changes. Analyzed and fixed problems reported by users using in-house problem-tracking software. Used the Netezza database appliance via SAS pass-thru to generate multiple reports. Wrote a SAS program using Perl regular expressions to facilitate masking selected diagnosis codes in sensitive reports.
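A masking step of that kind can be sketched with PRXCHANGE; the ICD-9-style code pattern and field names are assumptions for illustration:

```sas
/* Mask anything shaped like a diagnosis code (3 digits, optional
   dot, 1-2 digits) in a free-text field. Pattern and variable
   names are illustrative. */
data masked;
   set work.claims;
   length note_masked $200;
   note_masked = prxchange('s/\b\d{3}(\.\d{1,2})?\b/***/', -1, note);
run;
```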

Wells Fargo Home Mortgage
(Minneapolis, MN)
(San Antonio, TX)
66% Remote
April 2010 - September 2011

Using ideas and specifications written by risk analysts, I developed an analytical dataset and application to track mortgage servicing data on a daily-change basis. Developed analytical datasets for registering in SAS business intelligence cubes. Developed datasets for general user access via online web links. Developed ad-hoc reports and researched anomalies in data delivery. Developed DDE programs to create and update dashboard reporting for management. Combined multiple formats into a SAS dataset for analysis (ETL). Created daily datasets of new workout loans and decisions made. Performed limited Information Map Studio development with loan-based data, creating targeted analysis. Used the SAS Add-In for Excel for analysis and validation. Using PROC OLAP, I built cubes with loan data summaries for general viewing. Created a management dashboard using summaries and detail coding to automatically create data panels. Created datasets using piped input of daily, weekly, and monthly detail and summary files.

Texas Education Agency
(Austin, TX)
Part time Remote

February 2011 - May 2011

Extracted and cleaned data from Excel spreadsheets to cross-match name and address information from the SAT and ACT testing agencies. Using the SAS SOUNDEX function to assist in matching names and addresses, we achieved a 95+ percent hit rate. Students were allowed one free test but in some cases registered with both agencies. Reports were created with matched student information from each of the two agencies. The agency with the earlier date and time stamp kept the registration; the other offered the student the option of paying or being removed.
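The phonetic cross-match can be sketched as a SOUNDEX join; the dataset and variable names are assumptions:

```sas
/* Join the two registration files on phonetic codes of last name
   and street, tolerating spelling variations between agencies. */
proc sql;
   create table matches as
   select a.student_id as sat_id,
          b.student_id as act_id,
          a.lastname, b.lastname as act_lastname
   from sat_reg a
        inner join act_reg b
     on soundex(a.lastname) = soundex(b.lastname)
    and soundex(a.street)   = soundex(b.street);
quit;
```

In practice a match like this is usually confirmed with additional fields (date of birth, ZIP code) before acting on it.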

SunTrust Mortgage
(Richmond, VA)
50% Remote

February 2008 - March 2010

Created UNIX accounts, directory structures, and permission levels for the forecasting/modeling group. Ported existing SAS programs and datasets from Windows to UNIX, making appropriate program changes. Supported the forecasting/modeling effort by extracting and combining data from various data sources. Reviewed and rewrote existing code for efficiency. Implemented lookup tables to speed up processing. Loaded Census data for analysis and support of modeling. Utilized SURVEYSELECT and RANUNI for random data selection and analysis. Performed data validation and cleansing while developing analysis datasets. Developed model scoring code for historical and current data that generates tabbed Excel spreadsheets using ExcelXP tagsets, Import/Export, DDE, and direct LIBNAME access. Created charts and graphs of variables of interest for management review. Developed a UNIX-scripted automated validation process to create a unique-occurrence table for each dataset and align old and new data side by side. Created web-based reports in support of model development. Created a SAS/IntrNet-like web application using Apache as a proof of concept for future web development. Developed multi-platform processes to utilize both Windows and UNIX servers. Used piped input of similarly named files from third-party data feeds to create daily mortgage changes.
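The SURVEYSELECT random selection can be sketched as a simple random sample; the dataset names, rate, and seed are illustrative:

```sas
/* Draw a 10% simple random sample for analysis/validation.
   A fixed SEED makes the sample reproducible between runs. */
proc surveyselect data=work.loans out=work.loan_sample
                  method=srs samprate=0.10 seed=20080301;
run;
```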

Nestle Purina
(St. Louis, MO)
30% Remote
January 2007 - January 2008

Member of a team that converted SAS programs and datasets to integrate with SAP. The conversion included deleting, adding, and changing the formats of selected variables. Data validation and cleansing were done during the ETL process. Heavy use of macros, SQL, SQL Server, and Base SAS programming. Utilized indexes and table lookups to speed up processing. Used simple Excel tagsets to create spreadsheets. Additional development work was performed to create a monitoring and validation process for promotional sales. Developed an automated conversion and validation process to assist in making formatting and data changes.

Railroad Retirement Board
(Chicago, IL)
May 2007 - August 2007
(Remote part time)

Made modifications to PC and Mainframe SAS programs that were being transferred from IDMS to DB2. Changed Mainframe SCL programs that generated HTML output.

Wells Fargo Home Mortgage
(Des Moines, IA)
July 2006 - December 2006 (Full time)
January 2007 - March 2007
(Remote part time)

Worked on the financial data mining and data-modeling project. Created datasets in support of developing Basel II compliant models. Downloaded Census data in support of model development. Utilized ETL and Sub-Setting datasets in the analysis and generation of reports, charts and graphs. SAS procedures used include SQL, GPLOT, GCHART, FREQ, SUMMARY, SURVEYSELECT, DOWNLOAD/UPLOAD, IMPORT/EXPORT and FORMAT. Performed data cleansing during analysis, loading and validation of the data (ETL).

Boeing Shared Services Group
(Oak Ridge, TN)
90% Remote
August 2005 - June 2006

Participated in the planning and successfully executed a migration from an IBM/VM CMS system to MS/Windows Server. Used VM directory output as meta-data input to SAS code that automated the transfer of over 425 User ID's with more than 52000 files. Included were 170+ users with over 4800 SAS datasets that were CPORTed, FTPed and CIMPORTed. Modifications were made to all programs for the new environment. High profile applications were then modified to provide output in a web based form.

Wells Fargo Home Mortgage
(Des Moines, IA)
April 2004 - November 2005
10% Remote

Financial data mining/data modeling project developing Basel II compliant models. The Risk Management Project Team focused on developing “Probability of Default”, “Repurchase Exposure”, and “Loss Given Default” models to assist in identifying loan profiles that might fit into these categories. My responsibilities included extracting, analyzing, transforming, and loading (ETL) datasets with basic modeling variables for each loan segment type and performing initial data cleansing and discovery reports and graphs. In addition, I added several model-specific variables based on requests from the modeling team. SAS procedures used included SQL, TABULATE, MEANS, FREQ, GRAPH, DOWNLOAD, UPLOAD, UNIVARIATE, RANK, EXPAND, and FORMAT. Data was downloaded as CSV files and transformed into formats for Housing Price Indices, historic interest rates, and various loan types and lengths. LGD development required several Excel worksheets to be loaded into SAS datasets for conversion into formats or lookup tables. SAS datasets were created with indexes for fast processing of unique keyed items.

Eckerd College
(St. Petersburg, FL)
February 2004 - February 2004

This was a data recovery and reformat project. An Associate Professor at Eckerd College contacted me with an urgent plea to assist him in recovering some data that was only available in a Word document. The data had been loaded into SAS datasets and reports were generated. The data was recovered from the reports in the Word document. In addition the data needed to be formatted for import into SPSS. This was accomplished by exporting the data to an Excel spreadsheet.

(Houston, TX)
January 2004 - February 2004

As a member of the Enterprise System Group, I developed an Oracle and SAS Data Architecture for Data Mining Project. Some data was extracted from Oracle using SQL and other data from MS SQL Server. Additional data was received in Excel spreadsheets and imported into SAS. Using SAS Enterprise Miner and Clementine, a Warranty Fraud Detection System was implemented. Service Providers are periodically analyzed for any activities that might appear to be suspicious. With SAS and/or Clementine further statistical analysis is performed to determine if an audit should be scheduled with the suspect provider.

Zale Corporation
(Irving, TX)
June 2003 - November 2003

As a member of the Database Marketing team I provided SAS Consulting and Development services. Included in my responsibilities was the creation of various CRM Reports and extracting and analyzing data for Model studies. Reports developed included Campaign Management, Customer Retention and Cross-Sell/Up-Sell. With UNIX scripting the reports were developed from Oracle tables, using SQL and Data Step programs to extract data from an Epiphany CRM system on UNIX servers. Extracted data were reduced and output to Excel via DDE. Macros and Scripting were used to automate and schedule the jobs for various execution times. Model data was created using SQL to access a UNIX Oracle Data Mart. Over 250 million records were mined for demographic and life style information used for Modeling. Web based reports were created by accessing Excel spreadsheet data and creating custom HTML, JavaScript and PERL.

Entergy Resources
(Houston, TX)
August 2002 - February 2003

Member of a forecasting team that developed a suite of SAS applications for analysis of energy load time forecasts. Determined the individual cost items associated with delivering energy by analyzing tariff filings. Built a tariff dataset for ad-hoc and production jobs. Interfaced with an Oracle database from SAS through PROC SQL and pass-thru to retrieve and update forecasting results. Used SAS/CONNECT to remote-submit jobs. Wrote SAS applications to analyze and estimate the service delivery costs for providing electricity. Wrote and maintained an application that allocated estimated new customers across billing cycles. Excel spreadsheets were created with the resultant data. Supported the marketing group as Entergy geared up to compete in the deregulated energy market in Texas.

Williams Energy
(Tulsa, OK)
November 2001 - June 2002

Supported the analysis and reporting functions surrounding Risk Management processing. Created, modified, or enhanced three functional areas of assessing the Value at Risk, or "VaR". Developed a web-based application using SAS, SAS/IntrNet, HTML, and JavaScript that gives users an interactive method for "what if" Risk Analysis processing with current holdings. Provided SAS/SCL programming support for the Value at Risk Reporting System. Modified and enhanced the Risk Analysis GUI and programs to allow for multiple model runs. Extracted data from Oracle and Sybase with PROC SQL and SAS SQL pass-thru. Created HTML mockups for new Credit VaR Reporting. Used JavaScript and PERL for dynamic data filtering, uploading, and downloading.

Guidance Research
(Alpharetta, GA)
July 2001 - November 2001

Provided the programming, validation, analysis and aggregation for Direct Marketing models including Customer Segmentation Analysis, Cross Sell Modeling, and Loyalty/Retention Modeling. In addition to complex Data Step processing several SAS procedures were used including, FORMAT, FREQ, SUMMARY, and TRANSPOSE.

Just For The Kids
(Austin, TX)
June 2001 - October 2001

A non-profit organization where we loaded student test scores into a statistical model. Analysis compared similar schools based on selected demographic data. I also reviewed and remodeled some of their programs to make them more efficient. Additional responsibilities included the analysis of newly loaded datasets. Several SAS procedures were used including, FORMAT, FREQ, TABULATE, SUMMARY and TRANSPOSE. HTML output was created via ODS for the statisticians to review.

(Austin, TX)
October 2000 - May 2001

Reviewed SAS programs from different departments and combined their processing into an automated operation. Using SAS on VMS, UNIX, and Windows I wrote programs that processed data on Windows then uploaded to VMS for further processing. Once the process was completed on VMS the results were then uploaded to UNIX for the final application’s processing. Limited scripting and FTP were utilized in the processing and data movement. Wrote utility programs to allow UNIX directories to be viewed on the web. Enabled automated Email messages from processing programs that allow IT personnel to track execution progress.

(Memphis, TN)
August 2000 - September 2000

Using SAS and SAS web software developed a fuel cost monitoring system. Cleaned up and streamlined several SAS programs that were generating HTML. Added analysis and reporting options to existing menus.

SAS Institute
(Dallas, TX)
April 2000 - July 2000

Worked with SAS Development team to create a SAS/Oracle based Data Warehouse application for Sprint. Was member of QA team that reviewed programs for standards adherence and maintainability. Performed unit testing of Data Step and SCL programs. Wrote Oracle SQL scripts to extract data from warehouse to facilitate content validation. Used base SAS procedures FREQ, TABULATE, and MEANS to validate current results with expected results.

Eli Lilly & Company
(Indianapolis, IN)
October 1999 - April 2000

Worked with Regulatory IT team to create a SAS/Oracle/Intranet based application for Pharmacovigilence research. The SAS based application extracts Adverse Event information from Clinical Trial data. It then creates early warning drug symptoms tracking and frequency analysis. The Intranet Web access allows various selection criteria as well as comparison with FDA data.

Tennessee Valley Authority
(Norris, TN)
August 1999 - October 1999

Web enabled existing applications. Using Visual Basic along with MapObjects and Internet Map Server software from ESRI, existing applications were re-engineered to run from the Web.

(Greenbelt, MD)
November 1998 - July 1999

Provided SAS programming support and mentoring for new staff. Offered assistance with system design and implementation issues. Converted programs from reading legacy flat files to accepting input from SAS datasets. Developed extract processes to move data from a Mainframe environment to Unix and Windows NT.

Tonn & Associates
(Austin, TX)
April 1998 - October 1998

Provided forensic data mining and analysis in litigation support for the Oklahoma vs. The Tobacco Industry case. State Medicare records for 1993 were loaded into SAS datasets. Several SAS load and analytical procedures were written. An emphasis was made to provide the output via the World Wide Web. Macros available from SAS Institute were utilized along with custom code, to publish the results of reports with embedded HTML codes.

Tonn & Associates
(Austin, TX)
March 1997 - January 1998

Provided forensic data mining and analysis in litigation support for the Texas vs. The Tobacco Industry case. Several different sources of financial, demographic, and medical data were loaded into SAS datasets for analysis, including NMES-1, NMES-2, MEPS, and BRFSS. Several SAS load programs and analytical procedures were written.

VRC Corporation
(Alexandria, VA
Mt Laurel, NJ)
May 1995 - April 1998

Provided analytical and SAS programming support for "JCALS", a Joint Services test to evaluate the ability of a networked system of computers to effectively operate in support of the requirement to acquire, manage, update, publish, warehouse, and distribute documentation electronically. Member of a team that took leading-edge steps to utilize operating-system-generated data (automated data) for performance evaluation. Instrumental in discovering the availability of automated data. Devised a methodology to capture and transfer collected data for evaluation purposes. Several SAS programs were written to load the collected raw data into databases, and several SAS procedures were written to analyze the data and produce reports.

VRC Corporation
(Fort Bragg, NC
Fort Knox, KY)
August 1994 - May 1995

Provided analytical and SAS programming support for the CTASC-II IOTE, a United States Army test to evaluate the ability of a networked system of Mini-computers, Workstations, Desktop PC's, and Laptop PC's to effectively operate in the support of Army Supply Requirements. Used macros and base SAS procedures with data step programming to provide analytical reports. Provided the first glimpse into using Operating System and Application generated audit data for performance evaluation.

Provided analytical and SAS programming support for the A2ATD, a U. S. Army test to evaluate the effectiveness of using simulators to simulate tanks in an overall operational environment. Used macros and base SAS procedures and SAS/GRAPH to produce comparative results.

Chevron Petroleum
(Houston, TX)
June 1994 - July 1994

Created specifications for the development of ArcInfo and ArcView applications supporting exploration activities. Used SAS to prepare spatial data from sixteen different sources for merging into the ArcView system. Implemented the design using ArcInfo AMLs and created ArcView "views" for casual-user browsing and evaluation of prospective exploration opportunities.

Coleman Research Corp
(Fort Hood, TX)
October 1993 - May 1994

Provided analytical and SAS programming support for the M1A2 IOTE to the United States Army Operational Test and Evaluation Command. Extensive Data Step programming with base SAS procedures and SAS/Graph to analyze and graphically display communications and navigational improvements to M1A2 tanks. Used Macros to help develop programs with multiple purposes. Assist Junior and Mid-level programmers with SAS related questions and problems.

Shell Oil
(Houston, TX)
April 1993 - July 1993

Provided maintenance programming services for System 2000 using Cobol and running on IBM/CMS to extract information from existing databases to export to another system. Performed maintenance programming for several System 2000/Cobol database access programs. Wrote and modified REXX Exec's to incorporate new features added during maintenance programming.

Mobil Oil
(Houston, TX)
March 1985 - April 1993

Wrote SAS programs and SAS/GRAPH procedures on Workstations and PC's to load and perform competitive analysis on worldwide petroleum licensing, field production, company participation and well production history information.

Participated in the design and development of Worldwide GIS database using Arcinfo on Sun Sparc Workstations. Extensive use of AML's in application development including displaying menus and the creation of reports. Converted data in digital and flat file format from various vendor and in-house formats into Arcinfo coverages. Used ARCEDIT to make coverages "production ready". Provide liaison between Mobil and ESRI (Vendor of Arcinfo). Installed and created several GIS applications using Arcview on Sun Sparc Workstations. Provided user support for Arcview on PC's and Sun Sparc Workstations.

Provided limited system administration functions on Sun SPARC Workstations: maintaining the Unix operating system and application software, generating kernels to add or delete features, and monitoring machines in a network environment to ensure active status.

Maintained and supported Worldwide GIS (Petroconsultants) Wells, Fields, Concessions, and Operator data using Arc/Info on Sun Workstations and on IBM Mainframes running MVS/TSO using Model 204 User Language.

Designed and implemented a Project tracking system using Model 204 User Language on IBM running MVS/TSO, that tracks information related to exploration ventures. Assisted in the redesign and conversion of Petroleum related databases from System 2000 on Cyber computers to Model 204 on IBM Mainframes running MVS/TSO. Designed and implemented a system for the use of Petroleum related data files on the Cyber computer using System 2000 and FORTRAN. Provided user support for in-house usage of System 2000 on Cyber computers. Designed and implemented a System 2000 database tracking well location data received from Tobin and IOSA.

Houston Light and Power
(Houston, TX)
August 1985

Analyzed client complaint of excessive amounts of execution time for System 2000 functions running on an IBM Mainframe in an MVS/TSO environment. Recommended, then implemented changes necessary to bring performance to an acceptable level.

Pecten International
(Houston, TX)
March 1984 - April 1987

Designed and implemented a series of System 2000 databases that contained Petroconsultants Well, Field, Concessions, and Operator data on IBM Mainframes running VM/CMS using FORTRAN, ISPF, and the System 2000 Report Writer.

Designed and implemented a System 2000 database containing petroleum licensing data with financial and operational petroleum information on an IBM Mainframe running VM/CMS using Cobol, Fortran, ISPF and the System 2000 Report Writer.

Petroconsultants, Inc.
(Houston, TX
Geneva Switzerland)
November 1982

Consulted with the client and a vendor of timeshare database access products regarding the benefits of making client data available via the vendor's dial-up computer network.

Shell Oil
(Houston, TX)
September 1982 - April 1987

Designed and implemented a System 2000 database containing historical well information on an IBM Mainframe running VM/CMS using FORTRAN, ISPF and the System 2000 Report Writer. Participated in the development of a System 2000 database on an IBM Mainframe running VM/CMS, to contain a unified schema of all exploration and production activities. My portion of the schema was oriented towards the handling of Reservoir Engineering data. Implementation was made using Cobol, Fortran and ISPF.

Tennessee Valley Authority
(Muscle Shoals, AL)
August 1982

Participated in the development of an ISPF interface for System 2000 databases running on an IBM Mainframe in an MVS/TSO environment. Interface allowed for data entry and reporting for a system that monitored hazardous material generation/storage/disposal.

Shell Oil
(Houston, TX)
December 1980 - July 1982

Participated in the evaluation of System 2000 vs Adabas on IBM Mainframes running VM/CMS. Wrote benchmark ad-hoc queries, report writer and COBOL host language programs for both System 2000 and Adabas.

Designed and implemented a System 2000 database containing Petroleum Information Systems Well History data on an IBM Mainframe running VM/CMS using FORTRAN and ISPF.

Church's Chicken
(San Antonio, TX)
November 1980

Performed an audit of System 2000/COBOL programs and Report Writer programs to evaluate their effectiveness. Changes were recommended to client to improve productivity.

General Telephone and Electronics
(Tampa, FL
Stamford, CT)
December 1979

With System 2000 as the DBMS, I consulted in the design of an equipment storage database and assisted in the creation of programs to allow the re-use of the equipment.

Tennessee Valley Authority
(Muscle Shoals, AL)
November 1979 - September 1980

Redesigned and implemented a System 2000/COBOL database running on an IBM Mainframe in an MVS/TSO environment. The database was used in the monitoring of the radiation exposure of personnel working in Nuclear power plants. COBOL and the System 2000 Report Writer were used to generate some required Government reports.

Designed and implemented a System 2000/COBOL database running on an IBM Mainframe in an MVS/TSO environment. The database was used in the monitoring of the quality of air and water within the area of power generating plants. Several System 2000 Report Writer programs were written for reporting purposes.

Gulf Oil
(Harmerville, PA)
October 1979 - November 1979

Consulted in the design of and implementation of a System 2000 database running on a Univac 1108. Seismic data was loaded using System 2000/FORTRAN. An application was developed to allow users to selectively generate maps.

Tennessee Valley Authority
(Muscle Shoals, AL)
August 1979 - October 1979

Advised client personnel on design tradeoffs, reviewed client designs, and directed the programming effort for implementing client-designed databases.

Associated Milk Producers, Inc.
(San Antonio, TX)
July 1979

Consulted in the design of several System 2000 databases running on an IBM Mainframe under the DOS environment. The databases were used in controlling the collection, manufacturing, and distribution of milk products.

Petroleos Mexicanos
(Mexico City)
January 1979 - June 1979

Participated in the design of a production monitoring system. Consulted in the design of a transportation and sales distribution system. The PEMEX databases were implemented using System 2000/FORTRAN running on Cyber computers.

MRI Systems
(Austin, TX
Mexico City)
February 1975 - December 1978

Provided "hot-line" support for users of System 2000. Installed System 2000 on Control Data (Cyber) and Univac (Unisys) systems. Wrote quality assurance and installation verification programs for System 2000 using its Natural Language and Report Writer features and also using COBOL, FORTRAN, and PL/1 host language interfaces. Provided technical marketing support for System 2000 in Mexico. Supported initial MRI entry into Mexican market. Provided technical marketing support to the MRI representative company Software Internacional.

Operating Environments




  • Personal Computers
      • Windows (3.0 - 10)
      • Linux
  • Mainframes
      • VM/CMS
      • VMS
  • UNIX Workstations
      • Sun
      • HP
      • IBM
      • DEC
  • SAS V6 - V9
      • Base
      • Graph
      • Macros
      • Enterprise Guide
      • Enterprise Miner
      • SCL
      • SQL
      • Data Step
      • DDE
      • Formats
      • SAS/IntrNet
  • ESRI Products
      • ArcInfo
      • ArcView
  • Utility Software
      • CuteFTP
      • CSE HTML Validator
      • TextPad
      • ExamDiff
  • Microsoft Products
      • Internet Explorer
      • Excel
      • Word
      • Outlook
      • PowerPoint
  • Databases
      • Oracle
      • Teradata
      • Netezza
      • DB2
      • System 2000
      • MS SQL Server
      • Access
  • Languages & Scripting
      • HTML
      • Perl
      • SQL
      • JavaScript
      • Windows Scripting
      • Web CGI
      • Visual Basic
      • Fortran
      • Cobol

Copyright & Trademarks

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries.
® indicates USA registration.
