Secret of Rocket Science: Getting the Details Right
“This is not rocket science…” How many times have you heard that expression? In general, that statement is used to indicate that whatever you’re doing is not overly complex. It’s a tribute to the perceived complexity of rocket science. But just what is rocket science? Is it some arcane form of engineering that doesn’t relate to the things done in the commercial world? Or is there more to it? And, more importantly, can rocket science be relevant in today’s fast-paced market?
Dictionary.com’s first definition of rocket science is “rocketry” (English teachers used to scream at me for using different forms of the word in the definition but dictionaries seem to get away with it). Rocketry, in turn, is defined as “the science of rocket design, development, and flight.” The website’s second definition of rocket science is “something requiring great intelligence, especially mathematical ability.” So, on the surface, it appears rocket science is just that, the science of building and launching rockets with a nod toward things being complicated. Neither of those definitions satisfies me. Based on my experience in the industry, I believe they are incomplete. Only when you get into the nitty-gritty of a space launch does the essence of rocket science become clear. Rocket science is all about getting the details right.
With a space launch there are no second chances. There are no do-overs. If the launch fails, that’s it. A billion dollars may end up in the ocean, or scattered in pieces around the launch pad, or in a useless orbit around the Earth. No second chances. Once in space you can’t bring your malfunctioning satellite or probe into a local garage for repairs. You build in redundancies when you can and work to reduce the risk as much as you can.
The launch vehicle and its payload combined have hundreds of thousands of parts, subassemblies, and assemblies that must all work and function together for success to occur. A system with a million parts and 99.99% per-part reliability can still exhibit a hundred malfunctions, any one of which may lead to a catastrophic failure. So the emphasis in the commercial space launch world is on the details. You have to get them all right. When it comes down to it, rocket science is really the science of managing millions of details while also working to bring operational risk down.
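The arithmetic behind that claim is worth a quick sketch. It assumes, for simplicity, that every part fails independently with the same reliability; real vehicles use redundancy and see correlated failures, so this is only an illustration:

```python
# Back-of-the-envelope version of the reliability arithmetic above.
# Assumption (mine): each part fails independently with identical
# 99.99% reliability.
parts = 1_000_000
reliability = 0.9999  # per-part probability of working

# Expected number of malfunctioning parts across the whole system.
expected_failures = parts * (1 - reliability)
print(round(expected_failures))  # prints 100

# Probability that every single part works -- effectively zero.
p_all_work = reliability ** parts
print(f"{p_all_work:.1e}")
```

Even at "four nines" per part, the chance that nothing at all goes wrong is vanishingly small, which is why the work goes into managing the failures rather than hoping for none.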
How does that relate to you? Your project of replacing a machine on your assembly line, or developing a new drug, or testing your electronics package certainly doesn’t involve millions of details. True, but it still may entail hundreds, or even thousands, of interrelated tasks, components, and tests or inspections when you add up everything that has to be done. These are the details you must account for in what you do. Furthermore, there may be one detail that is ignored because your team members believe someone else must be paying attention to it. You find this out after it rises up and bites you in the butt. Or there may be a detail you just didn’t think of. There’s a reason why project management is one of the core competencies in rocket science. But it’s far from the only one.
Every component, subassembly, and assembly used in a rocket launch is either analyzed or tested to determine its suitability for use during the launch or on the payload. For many projects and products, this idea of analyzing and testing everything may seem like overkill, too expensive and time consuming. In the commercial world it probably is. Until you have a problem. Let me give you a real-life example. An automobile OEM supplier couldn’t seem to get its electronics to pass shock and vibration testing. Since the electronics were packaged the same way as previous versions, they were confident in the design. However, to be on the safe side, because this new version was slightly different in size and shape, they decided to run some tests. They used the same test fixture they had always used, one that had never caused a problem, making what seemed like a very slight modification to accommodate new mounting holes. Yet the parts failed.
In discussions with them, the question arose whether it was the actual electronic hardware that failed or whether it was something in the test set-up causing the failure. They didn’t have the capability to run the analysis to know whether the fixture, the environment, or the part design was the cause. My company had the capability. Our engineer on the project had done this sort of analysis countless times. We proved the fixture was resonating, adding higher loads than the electronic components would see in real life. We helped them redesign the fixture and their parts passed.
Rocket Science Technologies, Inc. has the knowledge and experience to help you in situations like this. We have the experience to guide you through this kind of failure recovery in an efficient manner to find a solution. We can also help you plan your next new product design and development to help avoid these kinds of issues. We can help you corral those details so the risk of failure or issues is significantly decreased. We can’t guarantee success, but we can improve the chances dramatically. And, in the case of something going wrong, we can help you get back on track and recover.
Rocket Science Technologies, Inc., has gathered engineers, physicists, mathematicians, and project managers as associates, available on an as-needed basis, to add their know-how to help you get past even the most challenging technical obstacles. All of RST’s personnel have shown through their careers a propensity for taking on tough problems and solving them. We relish solving technical challenges. We also understand the needs of the commercial market for reduced costs and higher velocity to production and market. Check us out at http://rocketscitech.com.
Where Has The Trillion Dollars Gone… and Lessons from Albert
We cannot solve our problems with the same thinking we used when we created them. — Albert Einstein
A few days ago I was doing my usual finger exercises with the cable remote when I stumbled on a Charlie Rose interview with Larry Summers, the former Clinton Treasury Secretary and former head of Obama’s Economic Council. I listened for a few moments and was about to continue with my finger exercises when something Summers said caught my attention. If you were to look at income distribution (i.e., how income is divided among the population) in the United States and compare the division now to what it was in 1979, you would find something very troubling. If the distribution today were the same as it was in 1979, 80% of the population (mainly the middle classes) would have $1 trillion more than it currently has, and the top 1% would have $1 trillion less. That comes to about $11,000 per family for the 80%. In discussing the causes of a slow-growth economy and income inequality, Dr. Summers pointed to lack of demand as the cause, not lack of supply. And yet, since the time of Ronald Reagan, the government’s response to the economy and its problems has increasingly emphasized the supply side of the economic equation, with neglect of the demand side.
A prominent economic theory in politics today seems to be that tax cuts for the rich and for corporations are the only way to stimulate the economy. The old concept of the so-called law of supply and demand seems to have been displaced by new schools of economics discounting the demand side as unimportant. In fact, it is argued that anything done to help the demand side will negatively impact the economy; think, for example, of the arguments against increasing the minimum wage. Today the focus is on the supply side of the economy, i.e., the rich and corporations, to the neglect of the demand side. We do this in spite of the acknowledged fact that the economy is 73% consumer driven. And where does consumer spending fall? On the demand side of the equation.
Why the focus on one side? How did this situation arise? Part of it has to do with taxes. Everyone hates them. So a theory that purports to improve everyone’s wealth by cutting taxes is very appealing. In other words, it appears elegant. What does it mean for a theory to be elegant? In the vernacular, it means it’s “cool.” It also means it’s simple.
In physics there was an argument over the duality of a physical entity. Is light a wave or a particle? Big names in physics were divided on each side of the argument. Isaac Newton was the biggest proponent of the particle approach with his so-called corpuscle theory. On the other side were Rene Descartes, Robert Hooke, and Christiaan Huygens, all well-known physicists and mathematicians in their own right.
The apparent nail in the coffin of the particle theory came in 1865, when James Clerk Maxwell, the brilliant Scottish mathematician and physicist, published a series of equations known, not surprisingly, as Maxwell’s equations. These equations described light as a wave made up of electric and magnetic fields, the so-called electromagnetic waves. Not only were the equations extremely elegant, but they seemed to explain all the aspects of light, such as refraction, diffraction, and reflection. Within a decade or so, Maxwell’s equations had been anointed by virtually every physicist as the answer to the centuries-old argument. Elegance and simplicity (at least to a physicist or mathematician), just like supply-side economics.
Then a strange thing happened. While conducting experiments to further verify Maxwell’s equations, Heinrich Hertz accidentally discovered that light can stimulate metals to emit electrons, the so-called photoelectric effect. It was a seemingly small and unimportant discovery that would change the world; solar cells, which produce electricity when exposed to sunlight, are a prime example of its application. Not only did Maxwell’s equations fail to predict the effect, but when applied in the right manner, they predicted the wrong answer. According to Maxwell’s equations, increasing the intensity of the light should increase the energy of the emitted electrons: shine a brighter light on the metal and the electrons should come off more energetically. Only that isn’t what occurs. Instead, the energy of the emitted electrons depends only on the frequency of the light (which we see as color), not its intensity, and there is a threshold frequency for each metal below which no electrons are emitted at all.
So the Maxwell slam dunk was suddenly derailed. His equations no longer described every aspect of light. For the next two decades, physicists searched for an explanation. It took Albert Einstein to provide the solution. His solution required light to act as a particle with a discrete energy based on the light’s frequency (color). But Einstein went a step further. He argued that light was both a wave and a particle and could act in either sense depending on the application. In other words, light consisted of photons that also acted as a wave. Einstein was awarded the Nobel Prize in 1921 for his explanation of the photoelectric effect. His 1905 paper on the effect started a revolution in physics that eventually led to quantum theory (which, in turn, led to inventions such as lasers and semiconductor electronics). So, in reality, both sides of the argument over the nature of light were right. In some cases you can use Maxwell’s wave equations and be correct. In other instances, you need the particle picture and quantum theory.
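For readers who want the formula behind the argument, Einstein's photoelectric relation (in its standard textbook form, not quoted from any source above) captures both observations at once:

```latex
% Einstein's photoelectric relation (standard textbook form):
% the maximum kinetic energy of an emitted electron depends only on
% the light's frequency f, not on its intensity.
K_{\max} = h f - \phi
% where h is Planck's constant and \phi is the metal's work function.
% Emission occurs only when h f > \phi, i.e., above the threshold
% frequency f_0 = \phi / h described in the text.
```

Intensity enters only through the number of photons arriving, which sets how many electrons are emitted, not how energetic each one is.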
I think there’s a lesson here that applies to economics. There are two sides to economics, supply and demand. Ignore either side at your own risk. Ignore the fact that the US economy is 73% consumer driven and let the middle class fade away, and then see what happens. Let income inequality continue to expand, and then tell me who’s left to purchase the supply? The laws governing economics are two-sided. Nature tends to prefer equilibrium, i.e., a balance between two forces. For example, our sun operates as a balance, as an equilibrium between the heat generated by nuclear fusion in its core, and its massive gravity, which compresses the hydrogen in the core sufficiently to create the high temperatures required for fusion. When equilibrium is broken in nature, the effect is usually catastrophic. For the sun, when the hydrogen is expended, the equilibrium will break down and the sun will expand into a red giant, ultimately engulfing the Earth. (Don’t worry, that won’t happen for another four billion years.)
For the last three years US corporate profits have been the highest they’ve been in a long time, but with minimal job creation (compared to the increase in profits) and certainly no wage growth for the middle class. One company reported its highest profit in history and still continued to lay people off. We hear the same old arguments: cut taxes and cut the budget, forgo investments in our infrastructure, keep wages suppressed. We’ve seen some recovery, but not nearly what one should expect at this point after a deep recession/mild depression.
We have a couple of ongoing experiments occurring now that should shed some light on this (please excuse the pun): Kansas and Wisconsin. In Kansas we’ve had an extreme case of tax cuts for the wealthy. State budget deficits abound and the economy is lagging the nation in its recovery. In Wisconsin, we’ve had a less extreme but still energetic application of supply-side-only policy. When both are compared to Minnesota, which took a more balanced approach, they are significantly lagging Minnesota’s growth numbers. Are these the economic equivalent of the photoelectric effect? Is it time for a more balanced theory and approach?
As Einstein said, “We cannot solve our problems with the same thinking we used when we created them.” I also think of another, better-known Einstein quote: “Insanity: doing the same thing over and over again and expecting different results.”
We need a more balanced approach, one that accounts for both supply and demand. We need a change in thinking. Albert, where are you when we need you?
A Tool for Fiction Writers Not To Be Overlooked
I just finished the draft of my second novel and now face the dreaded chore of editing. At best editing is a chore; at worst, a nightmare. Time and time again, self-published authors are chastised for their lack of editing or their poor editing. Likewise, agents and publishers gripe about the poorly edited manuscripts they receive. Readers complain about how distracting a poorly edited book can be. I sometimes find myself re-reading a sentence in a purchased e-book because it didn’t make sense or was so poorly structured. I have put books down when errors made them too difficult to read.
Why don’t writers edit their books? I guess there may be some who think their writing is too good, that they don’t need anything other than maybe an MS Word spellcheck. Some don’t think they even require a spellcheck. For others, it’s simply a chore they don’t want to deal with. For many, if not most, it comes down to a matter of cost. Professionally editing an 80,000-word novel, depending on the kind of editing, can cost from hundreds to thousands of dollars.
So writers either don’t edit or they try to do it themselves. As it turns out, most authors are simply not good editors, for a number of reasons. First, an author has too much invested in what he or she has written: the forest-for-the-trees problem, plus an inherent bias. Second, editors have a knowledge and experience base that many authors don’t have. There are self-help books, but even if you gain an understanding of the basics, you don’t have the experience to apply them correctly. And, last, the talents required for editing are very different from those for writing. The creative freedom of writing and the disciplined attention to detail of editing are often at odds. A few people may be fortunate enough to have both, but for most, the talent lies in one or the other.
So what is one to do? The costs associated with releasing a professional-looking book add up: cover art, professional formatting, a website, editing, etc. Even for someone going the traditional route, editing alone can be expensive. You could do as I did and marry your editor. Actually, I didn’t find out about her editing skills until after we were married. In truth, she’s not a trained professional editor, but she’s a fair facsimile, with a BA in literature and an MS in library science. She’ll catch most of the errors but may not be as nuanced as a professional in things like point of view. Still, her editing tightens up my work and provides an outside look at the novel’s flow and characters.
I have found an online tool that makes her life easier, catches many basic errors, and significantly improves my writing. It’s called AutoCrit and can be found at http://AutoCrit.com. First, let me say I have no connection to AutoCrit other than as a user. Second, I’m not advocating AutoCrit as a replacement for a professional editor. However, in my humble opinion, it does a creditable job of strengthening a manuscript. AutoCrit is a subscription service: a basic package at $5/mo allows you to edit 1,000 words at a time; a platinum package at $8/mo allows 8,000 words at a time; and a professional package has no word limitations plus a clever add-on that compares your writing to a norm for your particular genre. I find that last feature neat because, as a science fiction writer engaged in detailed world-building, I find myself in a constant battle with the so-called passive voice and the use of, to put it mildly, uncommon words. The AutoCrit genre add-on is still in its infancy and has its limitations, but they’re working to expand its capabilities.
AutoCrit allows you to address many of the basic ills of writing. Your working tool is a dashboard with headings: Home, Pacing and Momentum, Dialog, Strong Writing, Word Choice, Repetition, and Compare to Fiction. Home gives you a Summary. More on that later. Pacing and Momentum has two subheadings: Sentence Variation and Pacing. Dialog involves Dialog tags and adverb usage (he pontificated warmly). Strong Writing involves more Adverb Usage, Passive Voice, Showing vs. Telling, etc. I won’t bore you with a complete listing.
AutoCrit allows you to upload a Word document directly, but I found that capability of limited use because Word idiosyncrasies force AutoCrit to strip out all special formatting such as bold and italics. I write in the third person subjective with lots of italics, so it would be painful to go through and reformat the corrected section or chapter once I downloaded it back to my computer.
Instead, I’ve worked out my own methodology. I use cut and paste and then let AutoCrit do its thing. The home page gives you a summary of most of the major issues. If, for example, you click on repeated words, AutoCrit highlights them. I use that as a guide, place the two documents side by side, and work my way through the suggested corrections for each category. AutoCrit compares your writing to published fiction in categories such as “ly” adverbs, passive words such as “was” and “had,” sentences beginning with an “ing” word, and many others. The ratings for these individual categories run from “great” to “excess.” AutoCrit will also suggest how many instances of a particular problem you need to fix. For example, it may point out 36 occurrences of “had” and recommend eliminating twenty.
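As an illustration of the kind of counting involved, a few lines of Python can flag overused words in the same general way. To be clear, this is my own toy sketch, not AutoCrit's actual algorithm; the watch-list and the threshold are made-up values:

```python
# Toy word-overuse flagger -- an illustration only, not AutoCrit's method.
import re
from collections import Counter

FLAGGED = {"was", "were", "had"}  # hypothetical watch-list

def flag_overuse(text, limit_per_1000=5):
    """Return flagged words whose rate per 1,000 words exceeds the limit."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w in FLAGGED)
    per_1000 = {w: n * 1000 / len(words) for w, n in counts.items()}
    return {w: rate for w, rate in per_1000.items() if rate > limit_per_1000}

sample = "He had thought the ship was lost. It was not. The crew had survived."
print(flag_overuse(sample))
```

A real tool layers genre norms, context, and many more categories on top, but the core idea is the same: count, normalize, and compare against a baseline.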
I work my way through AutoCrit, using the highlighted words to identify suggested changes until everything is rated good or excellent (with the emphasis on excellent) on the Home page.
The 4,000-word first chapter of my second novel was reduced by more than 300 words after using AutoCrit. I deem it now ready for my wife or a professional editor to review. Is the book ready for publication without an editor once I’ve run all the chapters through AutoCrit? In my mind, no. I believe I still need the professional touch and an outside view. I am also having three beta readings before I complete the AutoCrit pass. I’ve asked the beta readers to look at plot, characterization, pace, and consistency before I spend the time with AutoCrit and editors to work on the book.
My process for editing my novel, in order, involves the following steps:
- MS Word spellcheck with some grammar check,
- beta readings,
- AutoCrit by me,
- editing by wife, and
- a final top-level read by a professional editor who provides this for a reasonable fee.
What if I weren’t married to my editor and couldn’t afford the professional editor for a final look? There are alternatives such as author working groups that may provide some help in this area. Still, I’d say my book is infinitely better by using AutoCrit than without it. At least, it’s a step in the right direction. If you decide you can’t or won’t pay for a professional editor, then AutoCrit and other software packages like it are certainly better than doing nothing.
Will the National Security Council Spearhead Government Effort to Combat Antibiotics Resistance by New Superbugs?
The President’s Council of Advisors on Science and Technology (PCAST) issued a report to the president in September on the increasing resistance of killer bacteria to existing antibiotics and the threat it poses to the United States and the rest of the world. The report received minimal fanfare but was referenced in today’s NY Times article, “Superbugs Kill India’s Babies and Pose an Overseas Threat,” concerning the increase in antibiotic-resistant bacteria in India and the threat this poses to the rest of the world. The PCAST report reviews the growth of the drug-resistant bacteria problem and offers potential solutions.
The PCAST report starts off with us imagining a world without antibiotics, like it was at the turn of the 20th century when “…as many as nine women out of every 1,000 who gave birth died, 40 percent from sepsis. In some cities as many as 30 percent of children died before their first birthday. One of every nine people who developed a serious skin infection died, even from something as simple as a scrape or an insect bite…” And the list goes on. We’ve grown accustomed to having antibiotics at our beck and call. Infection? Run to the doctor and get a shot. Magically you’re cured. Only that’s now changing. In India, according to the Times article, as many as 58,000 babies die annually from infections caused by these so-called superbugs which thrive in India’s sewers, rivers, and people due to poor sanitation. While this is only a fraction of the infant deaths in India, the number is increasing dramatically every year. More importantly, now some of these bugs have migrated to Europe and the US to join those that we already have here. Furthermore, these new so-called superbugs are virtually immune to all existing antibiotics.
This issue of the growing number of so-called superbugs is not new, but it’s rising in importance. According to the PCAST report, the CDC estimates the health-care cost to our economy of these infections at $20-35B. This will only go up as more of these superbugs appear and become more common. The issue has arisen from the overuse of antibiotics in humans and in agriculture. Furthermore, there are few new antibiotics in the pipeline because of the difficulty of creating these drugs, the long and expensive development and testing process required, and the lower profit numbers associated with antibiotics.
The limited profit-making ability of antibiotics is not to be taken lightly. Most of the superstar drugs today are lifestyle drugs in the sense that they don’t cure you but control symptoms over your lifetime; they’re medicines you’re likely to be on for the rest of your life. A recent Tufts study estimated the cost of bringing a drug to market at more than $2.5B. Lifestyle drug development costs can be recovered over years, as opposed to drugs like antibiotics, which involve a short twenty- or thirty-pill prescription and you’re done. We can see the impact of the costs for short-run drugs in lifesaving cancer drugs, which are taken for a relatively short time by a limited number of people and cost hundreds of thousands of dollars per year. In contrast, the “lifestyle” drugs taken over many years, e.g., cholesterol, heart, and arthritis drugs, cost in the $5,000/yr range. The expenditures for developing both types of drug are relatively similar; it’s the number of people using them and the length of time they’re used that result in the different prices to the user. There is some concern that the development and clinical-trial costs for antibiotics may be still higher, making it even more difficult to recover costs.
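The arithmetic behind that pricing difference is simple enough to sketch. The $2.5B figure is the Tufts estimate cited above; the per-course antibiotic price and the treatment duration are hypothetical round numbers of my own, used only to show the shape of the problem:

```python
# Rough cost-recovery sketch. Only dev_cost comes from the article;
# the prices and durations are hypothetical illustration values.
dev_cost = 2.5e9  # Tufts estimate of bringing a drug to market

# "Lifestyle" drug: $5,000/yr per patient for, say, 20 years.
lifestyle_patients = dev_cost / (5_000 * 20)
print(f"{lifestyle_patients:,.0f} patients to recover costs")  # 25,000

# Antibiotic: one hypothetical $200 course per patient, and you're done.
antibiotic_patients = dev_cost / 200
print(f"{antibiotic_patients:,.0f} patients to recover costs")  # 12,500,000
```

Three orders of magnitude more patients per dollar of development cost is the gap any market-based incentive for new antibiotics has to close.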
The PCAST report recommends appointing a member of the National Security Council as Director for National Antibiotic Resistance Policy (DNARP), who would report to the president to help coordinate a top-level government-wide Task Force on Combating Antibiotic Resistant Bacteria (TF-CARB) that is co-chaired by the Secretaries of Agriculture, Defense, and Health and Human Services. They will be tasked with developing steps to address the antibiotic-resistant bacteria that include:
- Expanding the surveillance of antibiotic use. The report indicates that 50% of antibiotic prescriptions are not needed and that is “a major contributor to rising antibiotic resistance.” This effort will include funding support for improved data gathering by local public health organizations to report on the use of antibiotics and to gain better data on the scope of the problem.
- Increasing the longevity of existing drugs by better managing their use, addressing outbreaks, and working to reduce the growth in antibiotic-resistant organisms. This includes addressing the issue of the use of antibiotics in agriculture.
- Increasing the rate at which new drugs to combat these infections are developed. This includes adding direct federal funding to support R&D, using non-traditional organizations such as DARPA (the Defense Advanced Research Projects Agency), addressing the costs and time required to approve these drugs, creating new economic incentives for pharmaceutical companies, and offering prizes (similar to the X Prize) for new diagnostics.
Some may feel this is another power grab by the government and that the problem can be addressed simply by market forces. Well, the fact is, the market hasn’t responded yet because of the cost and profit issues mentioned earlier. The PCAST working group included members of the agricultural, biotech, and pharmaceutical industries from such organizations as AstraZeneca, Iroko Pharmaceuticals, GSK, Novartis, and Smithfield Foods. To wait for the market to respond will be too late. The reduced effectiveness of antibiotics is rapidly becoming a major public health issue that will eventually impact all of us, especially our children and grandchildren. Hopefully, this effort will not be stymied by the rancor and partisanship in Washington.
On Pie-in-the-Sky and Commercial Space
Last week two of the so-called commercial space vehicles under development failed in fiery splendor. To some it’s vindication that the “NASA” way is correct. To others it displays the hubris of the billionaires funding these vehicles. Others wonder why we’re even wasting our time with this stuff when there are so many other problems in the world.
Some may think what I’m about to write as corny. Other might see it as far-fetched and pie-in-the-sky (or worse). That’s okay. Because I suspect there will be others who’ll get it. Why go into space? Why spend all the money and time and risk? Why try to cut the cost of space travel?
Why go into space? Because it’s there. Humans have always been a race of explorers. Or at least some of us have been. That’s what drove us up the road to civilization. Not everyone is an explorer. When the American West was opened up in the 19th century, some elected to follow the exhortation to “Go West,” while many others elected to stay in their cities on the East Coast or on their family farms. As I said, that’s okay. There should be room for both. In aviation’s early years during the 1920s, some elected to fly with the barnstormers while others elected to remain on the ground and watch. That’s okay, too. Now virtually everyone flies.
Why go into space and why spend all the money and time and risk? To ensure the survival of the human race. As long as we’re stuck on one world we’re vulnerable to destruction. It may be a natural disaster. An asteroid. Or the “ring of fire” volcanoes suddenly erupting. Or simply a plague. It could be human induced climate change. Or maybe a war. If we had substantial settlements on other worlds or in space the human race would survive. Pie-in-the-sky and far-fetched? Maybe. But it’s the pursuit of those pie-in-the-sky and far-fetched dreams that brought us to civilization.
What about the risk? No one is forcing astronauts to fly into space. No one forced those who bought tickets on Virgin Galactic to purchase them. Beyond that, risk is the price of advancement. Remember risk vs. reward. Or that old saying, “Nothing ventured, nothing gained.” Advancement comes at a price, and there are those who are willing to pay it. There are those who aren’t. That’s fine, too. As for the risk itself, what we’re seeing now is no different from the early days of aviation, when crashes were far more common. Or even the early days of the space program. Remember (and I’m dating myself) all those Vanguards blowing up on the launch pad before we finally got a Redstone to work and put the first US satellite into orbit? Remember Apollo One and the three astronauts who died in the fire?
Space is full of natural resources. Solar radiation, which can kill, is also a source of energy. Ice is plentiful. So is hydrogen. Those we know. We suspect that some of the asteroids may be rich in metals. Pie-in-the-sky? Maybe. But so was the transcontinental railroad. Or building a plantation in the wilds of Virginia in the 1600s. Some scoff. Some take action. History is full of people who say we can’t or shouldn’t. Fortunately, history is also full of those who ignore the naysayers.
Why cut the cost of going into space in the face of huge risks? The American West was not really opened up until the transcontinental railroad was built. Airlines weren’t really successful until the DC-3 and later airliners cut the cost and time to travel (as well as improved reliability). Airlines really took off when the airport infrastructure was built. Even the automobile wasn’t going anywhere (excuse the pun) as a means of mass transportation until Henry Ford built the Model T for everyone.
What about those billionaires? If they are driven by ego so what? Isn’t that the definition of an entrepreneur? Someone who is so sure of what they have to do and who may be willing to risk everything. Some say they’re just playing. Well, I guess so were those rich British aristocrats and merchants who funded that plantation in Virginia in the 1600s. So, to Elon Musk and Richard Branson and the others, you have the money and the will. More power to you. Do it. It’s how we got here.
Note: Part of this appeared as a comment on a NY Times article.
Project Management as a Cornerstone of Success for Small Business
In my years as an IPT (Integrated Product Team) lead and program manager, I don’t know how many times I’ve heard from a team member: “You worry about the budget and schedule and I’ll worry about the technical. That’s my job as an…” Fill in the blank: engineer, software developer, analyst, designer, technician, assembler, etc. This attitude is a recipe for failure, a formula for schedule and cost overruns. It’s not that these are bad employees; most of them were top-notch people. They just didn’t understand the impact their actions could have on the project and ultimately on the company’s bottom line. As engineers, they were trained to strive for technical perfection. As software developers, they have an innate drive to make their software better and to include more functionality. However, in the real world of business, you live under constraints and expectations that impose limitations. Resources are not unlimited. Risks are present. Customers have delivery-time expectations. Project management provides the framework for performing under the constraints imposed on the project or task. More importantly, project management serves as a discipline that relates to almost all aspects of company activities. The disciplines learned in project management can help increase productivity and the bottom line because they provide a sound methodology for examining the impact of a change or of how you approach a problem.
As human beings we live in a physical world framed by four dimensions: length, height, width, and time. Everything we do physically can be described by those four dimensions. For instance, if you plan to meet someone at the mall, you have to state a location, which represents a place in the three physical dimensions, and a time. Thus, four dimensions. Within those four dimensions there are constraints from the physical world that limit the things we can do. An example would be that gravity here on Earth is always present and makes things fall, limiting our movement in the height direction. Similarly, in the world of software, upload and download throughputs are limited by the fiber optic pipes of the ISP. You get the point. We live in a physical realm whose physical laws place constraints on what is possible. We learn to navigate within these constraints.
A project is built upon a structure based on its scope, i.e., what you’re going to do; schedule, i.e., when you’re going to do it; and budget, i.e., how much it’s going to cost. You can see that there’s an inherent time element involved. For example, the budget must be spread out over time. I call the combination of these three elements of scope, schedule, and budget project space, analogous to the space-time of the universe we live in: just as the three spatial dimensions plus time form the framework of the universe, these three elements form a framework for projects. We must learn to navigate project space just as we learn to navigate the physical world.
In industry, these three elements have been commonly known as the triple constraint, implying that every project event or detail must be evaluated in light of these three constraints. The Project Management Institute added, through its Guide to the Project Management Body of Knowledge (PMBOK®), three more constraints: quality, risk, and resources. The PMBOK® refers to these as the six constraints. I’ve given them the fun nickname, the sexi constraints, based on the Latin word sex, for six (like sextuplets).
Some people in industry argue there are two other constraints: requirements and customer satisfaction. I’ll discuss how many constraints there really are in a couple of other white papers. For now let’s stick to the six and understand their implications. Even if there are seven or eight it doesn’t change how you consider and respond to these constraints (except I lose the snazzy sexi nickname).
The point is, these constraints interact with and affect each other whenever you do something or make a change on a project. In performing a task on a project, you’re operating within these six constraints. You were assigned the task and it was approved by your manager (resources). Your labor must be paid for (budget). You’re expected to do it within a specified time (schedule). Your work may be reviewed (quality). The difficulty of the task must be taken into account to determine the best way to navigate any problems you may face (risk). Note that they all interact. For example, the review/quality element has a cost (the labor of the reviewers), resource considerations (their availability and, if an inspection is involved, the availability of the inspection equipment/facility), and risk (what if changes are required?).
In another example showing the constraint interactions, let’s suppose you’re forced to reduce a project budget by 10%. What do you have to change to meet the goal of a 10% budget cut? Do you decrease scope by taking out some tasks? What does that do to quality and risk? Do you need to modify requirements to reflect the change, which may mean changing the project charter for an internal endeavor and the statement of work for an external contract? Or to reduce costs do you reduce quality, maybe using sampling instead of 100% inspections? Does that increase risk, such as missing a bad component? Is that another requirement modification? All of these constraints are tied together and must be considered when you make a change, or, in fact, when you perform the original project planning. They must be evaluated and balanced to reach an optimum combination that allows you to reach your goal. So, as you can see from the sexi constraints, there is more to project management than just Gantt charts and budgets. Learning to live and navigate in project space provides your employees with a new outlook on conducting business that extends to all aspects of your company operations.
Implications for the Bottom Line and the Company Culture
Project management is a discipline that enables a project leader and his/her team members to navigate through project space and have a decent chance of arriving on time and within budget. If something untoward happens, the team is prepared to respond and recover. Furthermore, management is in a position to understand the project’s (and the team’s) progress and is less likely to be surprised if something unfortunate does occur.
What do I mean by project management being a discipline? Basically, it represents a way of thinking about and approaching tasks and problems. Even routine tasks that are not part of a project. It comes with a new awareness of accounting for all the factors impacting a task. If your company’s personnel are trained at least in the basics of project management, and now consider the six constraints when they perform a task, might they end up with a more efficient way of accomplishing that task with less negative impact on other aspects of your company? For example, if I require five signatures on this new form, what will that do to the schedule of accomplishing this task? If I reduce the number to four what are the risks? Does that impact the quality because someone isn’t in the loop? Or, in requiring five signatures, am I creating a roadblock (schedule and cost) because of the potential delays to obtain approvals? It’s a different way of looking at things. However, care must be taken that this discipline is not applied by rote, where conforming to the process is more important than the results.
Project management involves a methodology that instills a systematic way of thinking about tasks and the implications surrounding the actions taken to conduct and support those tasks. Your employees learn to appreciate the consequences of their actions. Having your people imbued with this philosophy expands the possibilities for more critical reasoning and the resulting improved efficiency, even on things that are not part of a project.
Using the Risk Management Process to Address Global Warming
I’m a trained project manager with a project management certification, the PMP. As such, part of my PMP training includes risk management, a process used in industry to manage the prospective risks or uncertainties encountered during a project. After reviewing the discussion on global warming I’ve come to the conclusion that risk management needs to be applied to the global warming debate. Risk management provides an approach to dealing with an issue that has some probability of occurrence and has the potential for devastating consequences. If you know something is definitely going to happen, it’s easier to weigh the costs and make a decision to deal with the consequences if they’re bad enough to warrant action. It becomes more difficult to deal with the consequences of something that might happen. In the latter case, you have to weigh the costs of mitigating something that might not happen (and therefore you’ve wasted the money) versus not doing something and dealing with the consequences. The tradeoff is like deciding whether to purchase insurance.
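The insurance-style tradeoff can be made concrete with a back-of-the-envelope expected-cost comparison. The sketch below is purely illustrative; the probability and dollar figures are assumptions I made up for the example, not real climate estimates:

```python
# Back-of-the-envelope expected-cost comparison for a risk decision.
# All probabilities and dollar figures are illustrative assumptions.

def expected_cost(probability, consequence_cost, mitigation_cost=0.0,
                  mitigated_consequence_cost=None):
    """Expected cost of a risk, optionally after paying for mitigation."""
    if mitigated_consequence_cost is None:
        mitigated_consequence_cost = consequence_cost
    return mitigation_cost + probability * mitigated_consequence_cost

# Assume a 20% chance of a $100B consequence.
do_nothing = expected_cost(0.20, 100e9)                      # accept the risk
mitigate   = expected_cost(0.20, 100e9, mitigation_cost=5e9,
                           mitigated_consequence_cost=40e9)  # reduce the impact

print(f"Accept:   ${do_nothing / 1e9:.0f}B expected")   # $20B expected
print(f"Mitigate: ${mitigate / 1e9:.0f}B expected")     # $13B expected
```

Even at only a 20% probability, paying $5B to cut the damage from $100B to $40B lowers the expected cost from $20B to $13B, which is exactly the arithmetic behind buying insurance.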
The idea of applying risk management to global warming came about during my involvement in a number of LinkedIn group discussions centering on whether global warming/climate change is real or not, and to what degree humanity is responsible for it. Some of the discussions occurred in LinkedIn discussion groups representing science organizations, or at least people interested in science, and were quite technical in nature. The discussions delved into interpretation of geological data, particularly from ice cores, and evidence of past climate cycles. In the discussions, it seemed to me the term global warming referred to human-influenced changes in climate, while climate change was used for natural, long-term changes in climate.
The media has reported that a majority of climate scientists support the idea of human-influenced global warming. In these LinkedIn discussions I observed that the scientific opposition centered on the interpretation of geological data and the lack of validation of climate models. It was pointed out in the LinkedIn discussions that the primary climate prediction model is in its 11th iteration and we’re still not accurately modeling what has already occurred, let alone the future. A recent article in the NY Times confirmed that the simulations are struggling, not because they’re wrong, but because they are limited by the complexity of the system and also by current computer capabilities.
To me, with my experience as an engineering project manager, it comes as no surprise that computer models and simulations sometimes don’t match measured data perfectly. The more complex a system, the more difficult it is for a computer model to perfectly match real world data. In some instances, as in the case of climate modeling, it becomes an iterative process, where each successive version of the model gets closer to the data as the modelers gain a better understanding of the physics, i.e., the response, of the system to various inputs. Sometimes, if a system is complex enough, it becomes a matter of available computer power. However, even when the model correlation to the data isn’t perfect, the simulations can be used to predict data trends. The models then become qualitative tools to help make decisions concerning a course of action.
Climate models are among the most complicated of all technical simulations, requiring the most powerful computers we have. I expect it will be a while before we can solve these models with a fine enough grid to get us the answers we need. Problem is, while we’re waiting, the Earth may be changing.
Prompted by the discussions and my thought of applying risk management to global warming I did research into the consequences of global warming, focusing on the potential impact to the coast of the United States if the oceans rise 7-10 feet as predicted. This is one of the primary outcomes described by climate scientists. Note, I was dealing with these as potential outcomes. So if the seas rise by the levels expected, a good portion of Manhattan would be under water, as would parts of the Carolinas, the Florida peninsula, and parts of Texas. The West Coast, with its higher shorelines, at least north of Los Angeles, would be less impacted. If you look at these consequences worldwide it gets even worse. Coastal flooding due to storm surge will also increase significantly. There will be many more Hurricane Sandys, and they will become more violent.
I also investigated the predicted weather pattern shifts across the US. Increased droughts are projected for the West Coast, including more forest fires and water shortages. Parts of the Midwest would also face severe droughts, key habitat changes, and higher temperatures, severely impacting its ability to continue acting as the breadbasket of America. Alterations in the habitats of birds and other animals will have major consequences for the insect population. There is also an expectation for the East Coast of increased occurrences of storms like Hurricane Sandy, possibly with even greater intensity. Weather over the Midwest is also expected to turn more violent.
These were indeed dire predictions. Even if they’re only half right, the negative impact on our economy, and the potential for loss of life are still very high. Project management practices dictate that when identifying a risk with consequences to the project potentially as dire as the global warming predictions, even if the engineering simulations were mixed or uncertain, a risk mitigation plan is required. Even if you believe the probability is only 20% that global warming is real, the consequences are significant enough to require a plan and a response.
So how do we deal with a risk like this? In industry, risk management provides a process to address risk. A quick sidelight. I received my PMP certification from the Project Management Institute, which is recognized worldwide. PMI publishes the Guide to the Project Management Body of Knowledge (PMBOK®), which summarizes the best processes involved in project management. Risk management is one of those processes included in the PMBOK®. There are four methods of dealing with risk:
- Accept the risk: Acknowledge the risk and accept the consequences
- Avoid the risk: Remove the risk by eliminating whatever is causing the risk
- Transfer the risk: Pass it on to someone else, e.g., purchasing insurance
- Mitigate the risk: Make changes to reduce the probability of the risk occurring or prepare plans to ameliorate the consequences once they happen
Let’s look at the four options of dealing with the risks of global warming and climate change:
- Avoiding the risk involves eliminating the causes of the risk. I don’t believe we have a really good option for avoiding global warming/climate change at this point. If the changes are the result of natural climate processes, as some advocate, there is little we can do to avoid them. If they’re due to human influence, I think it’s impractical to expect an instantaneous change to less polluting energy sources. It’s unreasonable to expect every country in the world, or even the major polluters, to stop using fossil fuels immediately. It will take a decade or more to get the plan in place and to begin making all the changes. Politically, it just isn’t going to happen. Besides, we’re already seeing some of the predicted effects. I believe we’re just too far along to avoid at least some of the global warming/climate change consequences. (This is different from mitigating them, which is described below.)
- Transferring the risk is the next method of dealing with risk. To me this method is unacceptable because if the consequences are even half as bad as predicted, there isn’t enough insurance money in the world to pay for them, not to mention the cost in lives lost to flooding and famine. The only thing we’d be doing is transferring the consequences to our children.
- Accepting the risk indicates the risk is acceded to because the cost of risk avoidance is unacceptable compared to the cost of the consequence. This category is usually used only for risks with low impact consequences or for risks with damaging consequences but an extremely low probability of occurrence. Opponents of global warming will obviously favor this approach. It’s the one seen as having the least near-term impact on the economy because we continue on our path of utilizing fossil fuels.
- Finally, the fourth method, risk mitigation. This involves taking action to reduce the probability of occurrence of the risk or to reduce its impact. Risk mitigation, then, requires that we start addressing those consequences regardless of cause (natural or manmade). For example, we can begin planning our response to coastal flooding on a national scale. If there is a significant human-induced component to the changes in climate, then it may also not be too late to reduce the impact and perhaps even influence the degree to which it occurs (as opposed to completely avoiding it). We can accomplish this by reducing the emission of greenhouse gases through increased use of alternative energy sources. Replacing old industries with new is part of the creative destruction process that has occurred throughout human history. (See my LinkedIn Pulse post Horse Manure, Buggy Whips, and Global Warming.) In creative destruction, the displaced workers often find work in the new industry.
In my opinion, it comes down to which is worse:
- Accepting the risk of global warming/climate change and, because we wanted to keep the status quo, being unprepared for all of its consequences, or
- Waiting for 100% proof that global warming is real in order to protect the status quo and then finding out that it’s too late for many of the mitigations identified, or
- Beginning the mitigations identified to reduce the effects of global warming/climate change (and accepting the cost of near-term economic dislocations) and then finding out climate change is a false alarm.

I guess your answer depends on whether you care more about yourself or your grandkids.
Horse Manure, Buggy Whips, and Creative Destruction with Global Warming
The process of creative destruction is often ignored in the debate about global warming, climate change, or whatever people decide to call it. Opponents focus on the costs of making changes as we convert to renewable energy and reduce our carbon footprint. They claim these technology changes will damage the economy, insisting the supposed high costs of renewable energy and sustainable manufacturing will cost the United States millions of jobs and weaken our economy. To that I say balderdash.
The introduction of new technologies is accompanied by something called creative destruction, when leading companies, or even industries, apparently successful at the time of the introduction, disappear. For example, the arrival of the industrial revolution brought about an end to those magnificent artisans of the pre-industrial economies, e.g., the blacksmith, the shoemaker, the weaver. Those old jobs disappeared and were replaced by factory and white collar positions. We moved from a rural society to an urban society.
Along with that change came a new set of issues. In 1900 there were 100,000 or more horses in New York City, creating thousands of pounds of manure that had to be removed. Hundreds if not thousands of workers toiled daily to clean up that mess. When it was introduced, the automobile was touted as a means of cleaning up the cities (among other things). I bet the workers who cleaned those city streets, along with the buggy whip makers, were among those who derided these newfangled toys, and probably shouted, “Get a horse!” With the introduction of the car came hundreds of companies trying to make them and capture the market. The manure workers and buggy whip makers probably also pointed to the failing early automobile companies as showing the folly of this technology. (Just as the opponents of global warming decry the failure of companies like Solyndra.) And true to form most of these companies went out of business or were bought out. The car seemed to be a toy, a plaything of the rich, much as the Tesla electric car is today.
Then along came Henry Ford and the Model “T” automobile and everything changed. He made the Model “T” “everyman’s” car while paying the highest wages in the industry to enable his workers to afford to own their own car. Sure, at the time, it was probably still more expensive to purchase than a horse, but what you could do with it! Now the average worker could afford a car.
What do you think happened to those workers who cleaned the manure off city streets? They probably ended up with jobs paving them. And those who worked for the buggy whip makers? They found higher paying jobs in automobile factories. One man’s risk is another’s opportunity.
I’m reminded of that wonderful diatribe by Danny DeVito in the movie “Other People’s Money” where he played a 1980s-style corporate raider, Larry the Liquidator, trying to take over a family-run wire-making firm in New England. In his diatribe he talks about buggy whip makers. “You know, at one time there must’ve been dozens of companies makin’ buggy whips. And I’ll bet the last company around was the one that made the best goddamn buggy whip you ever saw.” Then the zinger: “Now how would you have liked to have been a stockholder in that company?”
Yes, there will be disruption as we switch to renewable energy and sustainable manufacturing. But in the long run, new industries will be created and the economy will grow based on those new industries. That’s just the way the world works. And, better yet, we may have saved the world for our children and grandchildren, but that’s a subject for another day.