Raytheon Goals
Process Improvement motivated a reemphasis on measures
The process improvement goal was to improve business performance
Improve how Systems Engineering is done (better products with best practices)
Improve win rates (better proposals, better predictions)
Improve productivity (better execution, less rework)
Institutionalize EIA/IS-731 “Level 3”
We looked ahead to CMMI, anticipating measures for incremental builds
SW CMM Level 3
EIA/IS-731 “Level 3” following SW by only a few months
Integrating toward CMMI Level 3 (and beyond)
We wanted measures to grow naturally from existing reports and methods
Earned value procedures and concepts were in place and accepted
Functional managers were available as advocates and agents for change
Institutional goals were addressed ahead of project-specific measurement applications
SE and SW were coordinated, but not unified, in their measurement approaches
We expected to learn lessons and apply them
Measurement History – Through the course of acquisitions and mergers, common measures and common processes had lost momentum. Detailed technical measurement proposals had been developed, but they were not a high priority while competing with the integration of legacy operating organizations.
Existing measurement procedures – Primarily CPI and SPI; we were looking at measuring more (via Senior Engineering Management Reviews – SEMRs), such as number of requirements, risk, etc.
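The CPI and SPI referenced above are the standard earned value indices: Cost Performance Index (earned value / actual cost) and Schedule Performance Index (earned value / planned value). A minimal sketch of the arithmetic, with generic variable names not tied to any Raytheon system:

```python
def cpi(earned_value, actual_cost):
    """Cost Performance Index: > 1.0 means under budget."""
    return earned_value / actual_cost

def spi(earned_value, planned_value):
    """Schedule Performance Index: > 1.0 means ahead of schedule."""
    return earned_value / planned_value

# Illustrative figures only: $400K of work earned,
# $500K actually spent, $450K planned to date.
print(round(cpi(400, 500), 2))  # 0.8  -> over cost
print(round(spi(400, 450), 2))  # 0.89 -> behind schedule
```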
What site management wanted – A way to measure productivity, and a way to prepare better bids
Multiple authorities (matrix organization) – Measurement discussions often brought into focus the very different goals of site management, engineering management, and project management.
We involved functional leads and engineering managers in teams to define and prioritize the measures
We pressed measurement issues gently and consistently in regular meetings until we had consensus on the most useful, collectable measures
We de-emphasized changes to project reports as rolled up to senior management levels
We emphasized better detail and collection of baseline data on work product performance, characterization, and estimation
We created duplicate collection paths to provide easy alternatives for as many circumstances as possible
We defined scoring (compliance) mechanisms to promote the adoption of the practices that were to be measured
We communicated the site SE measures widely to project engineers.  Some projects are adopting compatible measures to serve finer-grained project needs.
We collect and analyze the available data with a strong focus on our bid model
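The scoring (compliance) mechanisms above are described but not specified. As a purely hypothetical sketch of how such a score might be computed (the practice names and the 0–3 rating scale are assumptions, not Raytheon's actual scheme):

```python
# Hypothetical compliance score: each measured practice gets a 0-3
# adoption rating, and the project score is the percentage of the
# maximum possible total.

def compliance_score(ratings):
    """ratings: dict mapping practice name -> adoption rating in 0..3."""
    if not ratings:
        return 0.0
    max_total = 3 * len(ratings)
    return 100.0 * sum(ratings.values()) / max_total

# Illustrative practice names, not from the source.
project = {
    "requirements_counted": 3,  # fully adopted
    "risks_tracked": 2,         # largely adopted
    "defects_logged": 1,        # partially adopted
}
print(round(compliance_score(project), 1))  # 66.7
```

Publishing a score like this alongside the measures themselves gives projects a visible incentive to adopt the practices being measured.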
What worked well – Involvement of SE discipline leads (practitioner leads); addressing the various users of the information; leveraging existing data (SEMR charts, existing cost collection systems, etc.).
What didn’t – We still don’t have a consistent WBS / cost collection structure used on every program (somewhat driven by the types of business at RFC), and we have not made good progress in collecting some of the HR-type data (years of experience, etc.), though we are getting RLI training-type data as an organization. We also made the measures TOO transparent to users: many aren’t aware of what’s collected, which causes its own difficulties.
Tools used – Access and Excel; we are looking at more sophisticated tools as we move to CMMI and an integrated set of measures
Piloting versus big bang – It worked well to develop the measures, begin collecting with pilot programs, and then roll out more broadly. It also worked to update the SEMR charts periodically. We are phasing in the overall measures: some of those defined are longer term, and we have not aggressively pursued their collection (degree levels, years of experience, etc.).
Closing the Bid Loop – Completing an R6s project, and pushing broader activity and awareness of this issue (e.g., working to kick off an R6s project on a standard bid code)
Emphasizing Simplicity
Up-Tempo Use for Projects
Supporting TPM Emphasis – Supported a recent program in using PSM and the measurement development method to develop better measures; we captured the approach details and will develop them into a standard workshop / guide for use by other programs
Terminology/glossary – PSM provided a common terminology, which helped in aligning the SW measures with ours (as they eventually aligned to the PSM structure).
Project focus vs. organization – We focused initially more on project work products than on organizational behaviors, but we developed the structure “for the future”.
Systems, software, project - As noted, the structure allowed us to align with SW and ensure we also had meaningful project data
PSM was a stable choice that met many needs – We compared other measurement structures to confirm that we covered the bases (the cube model; predictive / reactive; cost, quality, performance; process, product, program)