Sometimes insight and inspiration come from an unlikely place. Recently, I was invited to join the Facebook page for my high school’s 50th reunion, which is next year. As expected, it was fun hearing from those voices from the past, though I believe the Facebook connections were only a small fraction of the 1100 graduates in our class. My high school, Brooklyn Technical High School, or Tech, as we called it, was a specialized high school that drew its students citywide and required a test to get in. I guess Tech was one of the forerunners of today’s magnet schools. Tech’s curriculum was designed to prepare us for entry into the technical industries, with course majors in aerospace, electronics, chemical engineering, etc. It was rigorous and it was tough. And, hell, I was one of those geeks running around with a pocket protector with a six-inch steel ruler clipped to it, and a slide rule clipped to my belt (and I wasn’t teased for it because that was the norm).
One of my classmates commented that during a visit to the school he noticed the foundry classroom had been converted into a storeroom. Foundry was one of those classes that was supposed to prepare us for a technical career: a shop class on how sand molds were made to cast steel parts. We also took another class called Industrial Processes that covered how metals, wood, and plastics were processed in industry. This was all part of our training to ready us for a technical career in the 1960s. One highlight of the Industrial Processes class was a road trip to a Bethlehem Steel plant in Pennsylvania to view in operation the open hearth and electric arc furnaces that fabricated steel and steel parts. Even though low-cost imports were just beginning to come in from Japan, the plant was still a thriving, busy facility. My classmate’s comment about the foundry class struck me immediately as a metaphor for what has happened in the US in the last three or four decades. I wondered about that Bethlehem Steel plant and did a quick Google search, only to learn that the company had gone bankrupt in 2001. The plant that I had visited is now a Sands Casino (according to Wikipedia).
I’m now living in a Buffalo suburb. There are many old red-brick buildings in Buffalo and Lackawanna that remind me of the buildings at that Bethlehem Steel plant. These, too, used to be factories and manufacturing facilities employing thousands of people in well-paying jobs. They are now apartment lofts and museums. Yes, SolarCity is building a new plant here that will supposedly hire 1460 workers, but that is a shadow of what industry used to employ here. There is an effort to fund biomedical startups, but no one is under the illusion that we’ll be able to match the employment of my parents’ generation. Too many of the grandkids of the workers from those old plants now have far fewer opportunities for good-paying jobs in manufacturing. Maybe they can get jobs at some of the local call centers (if the centers haven’t all moved to India) or as healthcare workers taking care of their grandparents. Unfortunately, many of those jobs don’t come with benefits. So, do you still think this isn’t part of what’s powering the churn and disruptions in this year’s election?
While browsing Tech’s website, I discovered that the method for students to choose their majors had changed. In the sixties, you simply chose your major. Today it’s a process that involves something called the Power Index (PI), where each student is ranked according to his or her academic average, with some weighting on a couple of critical courses. Students then go online and list their choices of majors in order of preference. Those with higher PIs get their first choice, and so on down the line. Why is this process necessary? I think you can guess. I bet most of the students want to go into computer science. Well, why not? That’s where the money is these days. Unfortunately, the tech industry has not come close to filling all those abandoned red-brick buildings with jobs. Not when they make their hardware, and even their software, overseas.
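For the curious, the PI process as described is essentially a priority mechanism: rank students by PI, then walk each student’s preference list until a major with open seats turns up. Here’s a minimal sketch in Python; the names, averages, and seat counts are all made up for illustration, since I don’t know Tech’s actual formula or capacities.

```python
# Hypothetical sketch of a Power-Index-style major assignment.
# All names and numbers below are illustrative, not Tech's real data.

def assign_majors(students, capacities):
    """students: list of (name, power_index, majors-in-preference-order).
    capacities: dict mapping major -> number of open seats.
    Returns a dict mapping each name to its assigned major (or None)."""
    seats = dict(capacities)
    assignments = {}
    # Higher Power Index chooses first, working down the ranking.
    for name, pi, prefs in sorted(students, key=lambda s: -s[1]):
        assignments[name] = None
        for major in prefs:
            if seats.get(major, 0) > 0:
                seats[major] -= 1
                assignments[name] = major
                break  # first available preference wins
    return assignments

students = [
    ("Ada", 97.5, ["computer science", "aerospace"]),
    ("Ben", 91.0, ["computer science", "chemical engineering"]),
    ("Cal", 88.2, ["computer science", "aerospace"]),
]
# Ada and Ben fill the two CS seats; Cal falls through to aerospace.
print(assign_majors(students, {"computer science": 2, "aerospace": 5}))
```

The effect is exactly what the school’s process implies: when a popular major fills up, lower-ranked students cascade to their next choices.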
The method of major selection is also a metaphor for today’s data-driven society. At one place I worked, I was forced to rank the engineers reporting to me. The bottom 10% were mandated to be graded as “Needs Improvement,” even if their work was satisfactory. This was in line with Jack Welch’s philosophy of ranking all workers and firing the bottom 10% every year. Today, workers are commodities that can be discarded. Yes, I know the theory: to manage something, you first need to measure it. Problem is, people aren’t cogs.
I remember being told, “Don’t worry, even if the Japanese take over the steel and auto industries, we still have electronics.” Then a decade later, we were told, “Don’t worry about the electronics manufacturing plants that are being moved to Korea and Taiwan, because we still have the software and engineering.” Then a decade later we were informed of the research and engineering centers being opened in China by our transnational corporations. And so it goes. Add the impact of automation on manufacturing, and the future of those kinds of jobs here looks rather bleak.
The rise of Trump and Sanders in this election season comes as no surprise to me. A century ago, William Jennings Bryan led a populist revolt against industrialization. He lost, but the Industrial Revolution created a future of industrial jobs that helped mitigate the transition. The Information Revolution has not supplied an equivalent number of replacement jobs and is diligently working to eliminate more of them with automation. So what’s next? Tell me, what will the future be for my grandkids?
This is the first in a series of blogs addressing this issue.
See if you can identify this “discussion”.
Doubter: “The sky is not blue.”
Scientist’s Initial Response: “Can’t you see it? Just look at it.”
Doubter’s Response: “No. I’m color blind.”
Scientist’s Response: “Well, everyone else who isn’t color blind can see it.”
Doubter’s Response: “How do you know that you’re really seeing blue? How do you know that something in our diets isn’t impacting our color perception?”
Scientist’s Response: “Neurologists say that isn’t true. They’ve conducted studies to show we see the real blue.”
Doubter’s Response: “They’re all conspiring because it would be embarrassing to them to admit they hadn’t picked this up on their own. The government is paying all the researchers to support ‘the sky is blue’ theory because of the cost of changing all those American flags to the real blue.”
Now the same argument in a Global Warming context.
Doubter: “Carbon dioxide isn’t causing Global Warming.”
Scientist: “We have the data supporting that it does.”
Doubter: “I’m not a scientist. How can I interpret this?”
Scientist: “This has all been reviewed by peers and by thousands of scientists around the world.”
Doubter: “This is all a conspiracy by Liberals who want big government. All of the scientists are going along with it so they can be funded with work. NASA and NOAA falsified the data.”
My response to the doubter (and I hope yours): “Really?”
Note: A shortened version of this blog appeared in the Buffalo News “Another Voice” section on November 8, 2015
In one of those strange ironies of history, the automobile arrived as a response to an environmental issue of the time. In 1900, there were more than 100,000 horses in New York City and Brooklyn, creating about 4000 tons of manure and urine daily that had to be removed. Hundreds, if not thousands, of workers toiled daily to cart off that mess. Horse manure had become a significant health hazard for urban dwellers. There were even reports of a haze of manure and urine in the air in poor neighborhoods where the cleanup was not as effective.
A Rich Man’s Toy
Some saw the automobile as a potential solution to this problem. However, this was a time when automobiles were still in their infancy and could only be afforded by the wealthy. I bet those workers who mopped the city streets, along with the buggy and buggy-whip makers, led those who derided these newfangled toys with shouts of “Get a horse!” when an early automobile drove by. Yet there was enough interest in the nascent automobile industry to spawn hundreds of automobile companies, each trying to build a better car and create a new market. Most of those companies came and went as they failed to find the magic elixir to excite the public. This is often the case with the introduction of new technologies. Witness what happened in the dot-com mania of the nineteen-nineties, when many of the companies touting a new business paradigm failed to create that paradigm. I bet the owners of the buggy whip makers pointed to failing early automobile companies as proof of the folly of automobiles, just as the owners of brick-and-mortar establishments did during the rise of the Internet, and just as the fossil fuel companies and other global warming doubters point to the failure of solar energy companies like Solyndra as proof that renewable energy is doomed. In the first decade of the last century, the automobile seemed relegated to being a toy, a plaything of the rich, much as the Tesla electric car is considered by some today.
Henry Ford: Game Changer
Then, in 1909, along came Henry Ford with his Model T and everything changed. Ford touted the Model T as the everyman’s automobile, and he later paid industry-leading wages so that his workers could afford their own cars. Sure, at the time, the Model T was probably more expensive to purchase than a horse, but what you could do with it! Suddenly the average worker could afford a car, and the horse manure problem was solved. The automobile took off and became a huge industry employing thousands.
And what do you think happened to those workers who cleaned the manure off city streets when there was no more manure to remove? They probably ended up paving those streets for automobiles or they became automobile mechanics and gas station attendants. What about those workers at the buggy whip makers who lost their jobs? They found higher paying ones in automobile factories.
One man’s risk is another’s opportunity. Economists and historians have a term for this process of new industries and technologies replacing older ones: creative destruction. It has happened time and time again in history. Prime examples include the supplanting of 19th-century individual artisans by corporations wielding the machinery of the Industrial Revolution, and the aforementioned development of the Internet. In such instances there were winners and losers among the companies, but the winners always drove the economy to greater heights.
I’m reminded of that wonderful diatribe by Danny DeVito in the movie “Other People’s Money” (https://www.youtube.com/watch?v=62kxPyNZF3Q) where he played a 1980s style corporate raider, Larry the Liquidator, trying to take over a family-run wire-making manufacturing firm in New England. In his diatribe he talks about buggy whip makers. “You know, at one time there must’ve been dozens of companies makin’ buggy whips. And I’ll bet the last company around was the one that made the best goddamn buggy whip you ever saw.” Then came the zinger. “Now how would you have liked to have been a stockholder in that company?” Isn’t it time to replace the 20th century source of energy with a 21st century source?
Do You Want to Own a 21st Century Buggy Whip Company?
Yes, there will be some disruption as we switch to renewable energy and sustainable manufacturing. However, in the long run, new industries will be created along with new jobs, and the economy will grow based on those new industries. Germany already gets roughly a third of its electricity from solar and other renewable sources, and it remains a competitive world industrial power. Today’s fossil fuel companies are the buggy whip makers of the 21st century.
And, yes, the automobile ultimately played a significant role in another environmental problem, but along the way it contributed to a huge leap in prosperity, helping to create the richest country in history. So now we’ll use technology to solve the problem created by automobiles and fossil-fuel electricity generation, and along the way we’ll spawn whole new industries with new opportunities. That’s just how the world works; the whole arrow of human history points that way. Who knows whether SolarCity, Elon Musk’s and others’ bet on solar energy, is the next Ford Motor Company? Only time will tell, but it’s a step in the right direction.
More importantly, by adopting renewable energy, we may save the world for our children and grandchildren, but that’s a subject for another day.
To many, this recent stock market downturn is a hunker-down time. You know the markets will eventually recover; you just have to ride it out. Others have sold or shorted stock in the hope of short-term gains, though history shows that market timing is more difficult than it seems. To still others this becomes an opportunity: those with free cash available will try to make a killing by timing the market and buying while prices are low. All of those represent standard “inside-the-box” thinking. I’ve always believed that one person’s downturn is another’s opportunity. During the recession that began in 1873, Andrew Carnegie used the downturn in prices and wages to build up his steel company, investing in new equipment and hiring more workers. When the depression ended, his company was in position to take off, and the rest is history, as they say.
Now the world is facing a downturn in China’s growth. Add the stock market drop to the recent Chinese environmental and safety-related disasters, and the decrease in economic growth, and China isn’t looking like the path to riches it once did. To me China has always been a bubble waiting to burst. Well, it has. China may continue to grow but it will certainly mature at a much slower pace. Now we can decide what to do. We can hunker down and ride it out. The Federal Reserve will undoubtedly put off the expected increase in interest rates planned for the September time frame. The US economy will chug along at 2% growth or so. And a few rich people with free cash may make some more money by buying the temporarily depressed, cheaper stocks. Or we can take advantage of an opportunity here.
Sometimes an opportunity arises when two very divergent issues can be made to converge. Right now the US is faced with a deteriorating infrastructure and a lack of will to pay for it. (I’m being polite here). The highway trust fund is out of money and Congress isn’t willing to raise gas taxes to pay for it. For a Republican, voting to increase taxes can be likened to political suicide. So they voted a three month extension, kicking the can down the road.
At the same time, more companies that took business offshore may now be looking to return to the solidity of the US economy and its workers. The Chinese hare isn’t looking so fast anymore, and the US tortoise may be looking more attractive right now. The problem is what to do with the profits that these multinationals have been stashing overseas. They certainly don’t want to pay what they consider the exorbitant 35% US tax (though they hardly ever pay at that rate) on those profits. Bloomberg reported that these stockpiled offshore profits may be as high as $2.1 trillion. So what to do?
Of course, Democrats certainly don’t want to lower the rates. It’s a matter of principle; “pay your fair share” is their battle cry. The argument goes that the multinationals will invest any taxes saved. The Democrats don’t buy that. They believe the funds saved in the reduced taxes will go to stockholders in share buybacks and not result in increased US investment.

So here’s the solution. It requires people to bite down a bit on their ideological urges. Congress passes a law that reduces the tax rate to 25% on those “stashed” profits but requires all the money collected to be used only for capital improvements. In other words, use the “windfall” of taxes to fund the Highway Trust Fund and other capital improvements. To the multinationals: consider this an investment. The capital improvements will not only enhance the nation’s efficiency but will provide a short-term stimulus. Construction jobs typically pay higher than minimum wage, and more money in the pockets of consumers means more money to buy stuff, which is an opportunity for the corporations to sell stuff. The multinationals can also claim they are making an investment in America. Current low interest rates also provide an opportunity for these multinationals to move manufacturing back to the US at lower cost, while letting the government make more infrastructure investments through low-interest bonds.

The Republicans can claim this as a tax cut (25% vs. 35%) and can claim they helped fix the nation’s infrastructure without raising taxes. The Democrats can declare it a victory in collecting owed taxes and funding infrastructure. Both sides can claim they helped bring back manufacturing. (Let their spin doctors fight that one.) The ultimate winner is the American people.
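For scale, the arithmetic on that windfall is simple enough to do in a couple of lines (using the Bloomberg estimate cited above; the actual taxable base would surely differ):

```python
# Rough size of the proposed one-time repatriation windfall.
# Figures are the ones cited in the text, not an official estimate.

stashed_profits = 2.1e12   # dollars held offshore (Bloomberg estimate)
repatriation_rate = 0.25   # proposed one-time rate

windfall = stashed_profits * repatriation_rate
print(f"${windfall / 1e9:.0f} billion for infrastructure")  # ~$525 billion
```

Even if only a fraction of the profits came home, that is real money next to a Highway Trust Fund that Congress keeps patching three months at a time.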
The recent Space-X Falcon 9 launch failure provides new ammunition to the critics of the new space launch commercial industry. These detractors point to the United Launch Alliance’s perfect launch record and shake their heads sadly at the entrepreneurial upstarts like Space-X. “See,” they say, “we warned you. You can’t take shortcuts.” Of course, those ULA launches cost two or three times more than a Falcon launch, and are using proven rocket technology, much of which has been in place for decades.
The space industry, which includes space launch, is no stranger to failures, many of them quite spectacular and public. It comes with the territory. Some failures occur because a technology is new and you don’t know what you don’t know. People forget about Vanguard and its early failures in its attempts to launch a satellite. Sometimes rockets fail due to a hidden design flaw that may not reveal itself for many missions, e.g., the shuttle Columbia. Often design flaws result from compromises driven by outside forces such as budget; one can argue the shortcomings of the shuttle program were due to budget cuts early on. Another type of failure can be attributed to a process breakdown. The NASA Mars Climate Orbiter is an example of this sort, where ground-based software supplied thruster impulse data in English units (pound-force seconds) instead of the SI units (newton-seconds) the navigation software expected.
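As a toy illustration of the Mars Climate Orbiter lesson (not, of course, the actual flight software), tagging values with explicit units turns a silent English-vs-SI mix-up into a loud error at the interface:

```python
# Toy example: carry units with the value so a lbf*s / N*s mix-up
# fails at the boundary instead of silently corrupting a trajectory.

LBF_S_TO_N_S = 4.448222  # one pound-force second in newton-seconds

class Impulse:
    def __init__(self, value, unit):
        if unit not in ("N*s", "lbf*s"):
            raise ValueError(f"unknown unit {unit!r}")
        # Normalize everything to SI on the way in.
        self.newton_seconds = value * (LBF_S_TO_N_S if unit == "lbf*s" else 1.0)

ground = Impulse(10.0, "lbf*s")    # ground software reporting English units
flight = Impulse(44.48222, "N*s")  # flight software expecting SI
# Both sides now agree because conversion happened at the interface.
assert abs(ground.newton_seconds - flight.newton_seconds) < 1e-6
```

The point is not the three lines of arithmetic; it is that a process which forces units to be declared makes this class of breakdown visible before launch.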
For Space-X, a failure at this juncture of their program is worrying because their launch history is relatively short compared to their older competition’s. Coming on the heels of Orbital’s Antares launch failure and the Russian Progress failure, it also adds pressure on the space station resupply effort.
My hope is that this is something Space-X will learn from and move on, but it may not be that easy. Recent interviews with Elon Musk, Space-X’s founder, indicate the cause wasn’t something simple and straightforward. Sometimes in a root cause failure investigation you get lucky: the answer pops right out, and it turns out to be something simple. Other times it takes much more digging and painful probing. Those are the ones from which you really learn something about your product or your processes. Occasionally, you may even uncover a new fundamental aspect of engineering. I find we often learn more from our failures than from our successes. And, sometimes, the deep probing reveals something basically wrong with your approach.
Not the First
There’s much at stake here beyond just resupplying the space station, or even Space-X’s future. The space launch industry is at a cusp. The industry spent the first five or six decades of its life as a government entity, or at least dependent on and controlled by government agencies. Now we have a true nascent industry, one approaching the business as a commercial enterprise where the aim is to make money. Of course, the prime contractors building the vehicles and conducting the launches in the past were in business to make money, but they were doing it under the control of, and dependent on, government agencies such as NASA and DoD. Now the new space entrepreneurs are trying to do it on a commercial basis, in a competitive market.
This isn’t the first attempt to reduce the cost of space projects. NASA in the 1990s under Daniel Goldin attempted a “faster, better, cheaper” approach. Sixteen projects were conducted under this umbrella: ten successes, some spectacular (e.g., the small rovers Spirit and Opportunity, designed for a 90-day life; Opportunity has run for more than a decade and is still operating), and six failures, all spectacular in the sense of riveting news stories. The same detractors as those criticizing Space-X pointed to those six failures with an “aha!” and things returned to the way they were always done: near-100% success, but at higher cost. However, what those detractors often ignore is that those sixteen projects cost less than one traditional NASA project. It’s just that we live in a society where public failure is unacceptable and those with the best PR who scream the loudest win.
History May Repeat
Reducing launch costs is one key to democratizing access to space beyond a few governments and large multinational corporations. In some ways, I liken the space industry’s current status to the early years of the automobile industry or even the personal computer industry. By the early 1900s there were almost 200 automobile companies in the world, each catering to the wealthy, providing them with a new plaything: the automobile. Then along came Henry Ford and the Model T, and things changed. Suddenly the middle class, and, later, the lower class, could afford a car. The rest, as they say, is history. Only a few of the early automobile makers survived the churn Ford caused, but the automobile became a mainstay of American life.
The computer industry has a similar history. In the early 1980s, the personal computers made by Atari and Commodore were viewed as little more than game consoles, until Steve Jobs with the Mac and Bill Gates with MS-DOS arrived on the scene. Then computers became capable of doing office work, and, once again, the rest is history. You can point to a similar path in the development of railroads and aviation. Someone had to step up and take the risk to open up the technology to everyone.
Space Industry at a Cusp
That’s exactly where we are in the space launch industry. I’m not saying that Space-X’s Dragon capsule or Virgin Galactic’s SpaceShipTwo is today’s Model T (as much as Richard Branson might like us to believe). After all, a flight on SpaceShipTwo will run a quarter of a million dollars. Not exactly the stuff for everyman. I liken these vehicles more to Oldsmobile’s “Curved Dash”. Contrary to popular belief, Henry Ford did not invent the assembly line. That was Ransom Olds, building his Curved Dash Oldsmobile in 1901. Ford took Olds’ concept one step further with interchangeable parts and created a vehicle better priced for the average person (helped along by his practice of paying workers a living wage so they could afford to buy the Model T). The Model T sounded the death knell for many industry stalwarts like the buggy whip makers who had spent so much time belittling the automobile. Economists and historians call this creative destruction. (See Danny DeVito’s rant in “Other People’s Money”.)
Where Do We Go From Here?
The findings on the Falcon 9 failure may prove to be critical in this evolution of our access to space. I believe Space-X is making the leap forward in launch cost reduction mainly through process change and a less top-heavy organization. Yes, Space-X has made advances in the thrust-to-weight ratios of their engines, but those advances are hardly revolutionary; they’re using decades-old liquid rocket engine technology that they’ve updated. They’re relying on process changes and a leaner organization for the big step in dropping costs. They’ve pulled as much fabrication and assembly as is feasible in-house so they have better control of the processes and the cost. They’re treating space launch as a business, not like the launch of the next space probe to Mars. If the cause of the failure is discovered to be something basic to their processes, then the march toward everyman’s space may be diverted for the time being.
Virgin Galactic, another startup focusing on space tourism, doesn’t have the final answer to low-cost access to space, either. While the “fares” for their suborbital flights are predicted to cost $200K or more, they may still eventually commercialize suborbital flight and move it toward a more democratic availability. That will provide some commercial success in space access, but it will not address the 800-pound gorilla in the room: low-cost access to Earth orbit, which is the key to a true commercial space industry. There is a factor of sixty or more between the energy required to reach a sustainable orbit and the energy of SpaceShipTwo’s Mach 2 or so. That is still the challenge everyone faces.
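That factor of sixty is easy to sanity-check, since specific kinetic energy scales with velocity squared. A rough back-of-the-envelope calculation, using round numbers of my own choosing (about 7.8 km/s for low Earth orbit, about 1 km/s for a Mach 3 class suborbital craft):

```python
# Back-of-the-envelope energy gap between suborbital and orbital flight.
# Specific kinetic energy per kilogram is v**2 / 2, so the gap goes as
# the square of the velocity ratio.

def specific_ke(v_m_s):
    """Kinetic energy per kilogram, in joules, at speed v (m/s)."""
    return v_m_s ** 2 / 2

orbital = specific_ke(7800)     # ~7.8 km/s for low Earth orbit
suborbital = specific_ke(1000)  # roughly 1 km/s, a Mach 3 class craft
print(orbital / suborbital)     # ~60x, before gravity and drag losses
```

At the Mach 2 figure cited above (roughly 0.7 km/s), the kinetic gap alone is well over 100x, and that is before counting gravity and drag losses on the way up. Either way, “sixty or more” holds.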
Still, Space-X’s attempt to bring down launch costs and to commercialize space is the next required step in the evolution of access to space. In the commercial world, perception is everything. If the cause of Space-X’s failure is proven to be intrinsic to Space-X’s commercial approach, then we’re back to the old way of doing things, and space access for everyman is a long way off. We may find ourselves waiting for a truly revolutionary technology to achieve low-cost space access – something like antigravity – that may never come along, or at least not for decades. On the other hand, if Space-X can find the cause of the failure and move on, then the process of creative destruction will continue. Without a revolutionary technology we may never achieve the $1–$5 per pound cost of the current airline industry (Falcon 9’s estimated launch costs are in the $1800–2500/lb range), but launch costs still may come down enough to mimic the aviation industry of the mid-1950s, when inflation-corrected airfares were about five times higher than now. The average person flew less frequently, but they still flew. So I’m watching the outcome of this failure investigation with interest.
Having spent forty years in R&D and production, I’ve experienced my share of unexpected and undesired outcomes of projects, products, processes, and tasks. I’ve also been in a number of different environments where these undesired outcomes were treated very differently. I even had one boss early in my career chide me for being complacent about a successful test of a new design. He got on my case about design margins on a couple of the critical components and told me to “go break it.” Turns out he was right. Subsequent testing proved that there was not enough margin in the design.

In contrast, I had a corporate VP at another company admonish my team for a rocket test firing in which we ejected the nozzle before completion of the firing. He couldn’t accept that this was a test to examine the margins on a new design. We were attempting to determine how much material we could shave off to reduce weight, and we were operating in a regime beyond the resolution of our computer models. With customer concurrence, we had decided to conduct a test, and the customer fully understood that the result was not a failure. “This company will not accept failures!” the VP proclaimed at a monthly program review meeting. He was concerned only about his perception of the company’s reputation, even though the customer had signed off on the test and was perfectly satisfied with the result. The VP wasn’t. Or maybe his ego wasn’t. This was R&D, and this was a test designed to push the margin. If this had been a production motor I would have been on his side about declaring it a failure, but it wasn’t. Of course, this was the same VP who said he didn’t believe in his corporation investing in R&D. Instead, he believed that if the company needed a technology he could just buy it, and let other companies deal with R&D failures. Well, that’s a subject for another blog.
When you get into the production arena, attitudes toward failures change. Production failures can have significant long-term effects on the bottom line, customer relationships, and company reputation, depending on the industry and circumstances. Acceptance tests and inspections are conducted to ensure the quality of the unit going out the door to a customer. How does your company treat an acceptance test failure? Or an out-of-tolerance dimension?
I’ve worked at companies that operate on opposite ends of the spectrum when it comes to quality and the treatment of failed units. I worked for an automobile parts manufacturer delivering a $0.25 spray nozzle assembled from two press-fit injection-molded plastic parts. The units were 100% tested. Nozzles exhibiting spray considered out of tolerance were discarded. Statistics were kept on the number of failures, but no paperwork was generated for a failure; the part was just discarded. The company understood the variability in its manufacturing and assembly processes and had calculated what rejection percentage it could tolerate and still make its required margins. Only when the rejection percentage began to creep up (they counted the discarded units), endangering their margins, would they begin an investigation. They also understood that the design was simple enough that it was highly unlikely there would be any long-term latent defects hidden in the nozzle.
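My reconstruction of that nozzle maker’s logic (not their actual system) looks like a basic p-chart: alarm only when a lot’s reject fraction drifts beyond the rate the margins were priced for. The baseline rate and lot sizes below are invented for illustration.

```python
# Sketch of a p-chart-style reject-rate check, in the spirit of the
# nozzle maker's approach. Numbers are illustrative, not theirs.

import math

def reject_rate_alarm(rejects, lot_size, baseline_rate):
    """Alarm when a lot's reject fraction exceeds the baseline rate
    by more than three standard deviations of a binomial proportion."""
    p = rejects / lot_size
    sigma = math.sqrt(baseline_rate * (1 - baseline_rate) / lot_size)
    return p > baseline_rate + 3 * sigma

# Suppose a 2% reject rate is priced into the margins:
print(reject_rate_alarm(25, 1000, 0.02))  # 2.5% on 1000 units: within noise
print(reject_rate_alarm(45, 1000, 0.02))  # 4.5%: time to investigate
```

This captures the economics of their choice: no paperwork per discarded part, just a cheap statistical tripwire that triggers the expensive investigation only when the process itself has shifted.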
Contrast this to the rocket launch industry. I’ve worked for both a components supplier and a launch integrator. The industry’s slogan concerning failures can be summed up as “one failure is a trend.” A failure during acceptance testing was seen as an indication of a potential latent failure in units, even those that passed, that could have an impact during a launch or satellite/payload operation. When you consider that the value of the payload, rocket, and launch is typically in the hundreds of millions of dollars or more, you understand this philosophy. You only get one chance with a launch, and there are no repair facilities in orbit or deep space. So the launch and satellite industry has, in general, accepted this manifesto of “one failure is a trend.” Other industries, such as healthcare, either have adopted or are adopting similar attitudes toward failure because of the potential cost and damage of a systemic failure. On the other hand, I also recognize that new entrepreneur-led space launch companies like Space-X are trying to lower the costs of launch; how that will change the launch industry remains to be seen.
So what are the implications of the “one failure is a trend” rule? Essentially, a root cause investigation must be conducted for each failure. There are many methods of performing root cause analysis (RCA), including the Five Whys, fault trees, the “fishbone” diagram, and Kepner-Tregoe. It really doesn’t matter which process you use, as long as you work through the layers of design, manufacturing, human influence, etc., like peeling back an onion. The one thing you don’t want to do is take shortcuts or jump to conclusions. I’ve been on too many of these exercises where part-way through everyone “knew the answer,” only to find, once every box was checked, that something else, often a seemingly innocuous something that no one suspected, was the cause. This is why root cause analysis is not cheap: it has to be comprehensive and complete.
Am I advocating using RCA in every case of a production or process failure? No, of course not. The method used on that automobile component worked for them. For the rocket launch industry, they’ve decided it’s a case of “pay something now, or pay much more later.” A company has to weigh the costs vs. the consequences and then decide how it will treat failures. If you determine an RCA is necessary, whatever method you choose, finger pointing should not be part of the process. It doesn’t mean that the consequences of a failure shouldn’t be addressed if it involves personnel. It means that everyone participating in the process understands that this is being undertaken to find and correct the cause of a failure, not to blame someone. The culture of a company will determine how this plays out.
If you have a failure and decide RCA is required, and that you need some help, Rocket Science Technologies can provide assistance. We offer a free hour of consultation with initial inquiries that may help you decide whether you need to proceed with RCA and what method suits you best. There are no one-size-fits-all answers, but there are no shortcuts either. Rocket science is the science of getting the details right, and that is our goal in helping you.
My trusty ASUS laptop is dying. It takes forever to boot up, even after cleaning out the startup files. It has also exhibited a few blue screens of death in recent weeks. It is over three years old, which in computer years is two generations (and it was certainly not the most advanced when I bought it). So what to replace it with? I ended up with a Microsoft Surface Pro 3. Why? Well, let me tell you.
Buying computers these days is worse than buying a car. So many bells and whistles, add-ons. Of course, the temptation is to buy the newest and coolest. Apple has lived off of that impulse for decades. But do I really need a Ferrari or BMW (though I guess a Prius is considered cool by those concerned about global warming)?
I decided to take stock of what I needed in a computer. I’m 65 and supposedly retired, but ended up starting a technical consulting and innovative development business. That’s a challenge in itself. Try explaining to a 30-year-old venture capitalist that you really are an entrepreneur even though you collect Social Security and a pension. Well, that’s a story for a different time. Back to the laptop. It must handle Microsoft Office to support the business and my writing. I recently self-published my first novel, so it would need a keyboard. Those two factors rule out most tablets. At age 65, I’m a victim of the Law of Age-Related Laptop Weight Inversion. What’s that, you ask? It’s a modification to Newton’s theory of gravitation and Einstein’s theory of general relativity. It simply states that gravity has an age-related time function. At age 20, a three-pound laptop weighs a pound, or at least that’s the way it feels. By age 50, a three-pound laptop seems to weigh around three pounds, maybe a bit more. However, at 65 a three-pound laptop weighs seven or eight pounds, particularly in an airport. By the time you’re 72, it probably feels like ten or twelve pounds. Who knew that Newton’s law of gravitation had a time constant in it? I don’t think that’s the relativity Einstein had in mind. Anyway, all of this ruminating on weight seemed to be pointing me to an ultralight. Wait, that’s an aircraft, isn’t it? Ultrathin? Whatever they’re called.
I did the 21st century thing and went online, using different websites’ comparison software, and I discovered another law. You have two laptops of equal capability, but one is lighter. Which is cheaper? You would think it would be the lighter one. Less material, right? I learned that the Law of Inverse Laptop Weight ignores that factor. The lighter a laptop (usually with a smaller screen), the more it costs. I proved that empirically. The comparisons on the computer shopping websites brought that home.
After pondering all the variables in purchasing a laptop I decided I needed to touch and handle these machines to truly evaluate them. I don’t care what the purveyors of Web 2.0 think. Or is it 3.0 by now? I needed to determine how much a pound (of weight, not the British pound) was worth to me. So I decided to do the 20th century thing. Off to Best Buy we went. I dragged my wife Janeen along as my conscience in terms of cost.
Arriving at Best Buy and wandering among the machines, I met the Microsoft Lady. I never caught her name. I’m not even sure what she did other than check computers for Microsoft products. For all I know she could have been a lonely old lady who liked to hang around computers, but I don’t think so. She saw me looking for help and offered me some. She didn’t really care which computer I bought as long as it used Windows and wasn’t a Mac. More importantly, she was carrying a Surface Pro 3. She demonstrated how quickly it booted up. Compared to my old ASUS, this machine seemed like it had a warp drive. And then I held it. Wow! My conscience spoke up. Janeen was standing at my elbow, whispering in my ear about what I had said about keeping the cost down. So the Microsoft Lady took me over to the cheaper laptops. I lifted one. My God! It was like lifting weights! I kept trying out different ones. Janeen grew bored and drifted off to look at cameras. Then we came to an HP 2-in-1. Their latest one. Ultra powerful. And light. Well, relatively so. Three pounds. I lifted it. No wait, ten pounds. And it cost $1100. But it was so cool with its swiveling screen and detachable pseudo-tablet. Janeen magically reappeared at my elbow and began urging me to make a decision already. “This isn’t rocket science,” she reminded me. With the help of the Microsoft Lady, I realized I could get the same thing with a Surface Pro 3, and it weighed only two pounds! Of course, by the time I priced out the version I wanted with 8 gigs of memory and 256 gigs of hard drive, plus the keyboard/cover (which is extra), it came out a bit more than the HP. As expected, lighter cost more. I had firmly proved the Law of Inverse Weight. Janeen sighed and gave in, happy to get out of Best Buy. I went home with my brand spanking new Microsoft Surface Pro 3.
I’ve had it for almost two weeks now. How do I like it? Like most technology, I love it and hate it, with an overall lean toward love. My biggest complaint probably has more to do with Windows 8. Once home, I discovered another inverse law: the more powerful the processor, the smaller the software geeks who program it think they can make the type on the screen. The Surface has a default screen resolution of 2160 x 1440. This allows the geeks to think they can get away with an 8 pt font as a default. I guess they don’t know the Law of Inverse Font Size with Age. At age 20, an 8 pt font seems like 12 or 15 pts, while at 65 an 8 pt font is really 4 pts. Windows 8 does not make it easy to change the defaults. Changing the resolution distorts everything, plus you end up with ugly black bands around the screen. The personalization function allows you to change some font sizes, but it doesn’t seem to change everything. So error messages and notifications are still too small, almost impossible to read. I can read them, but at the price of squinting. I also downloaded Office 365 that weekend and found that it exhibited the same law of small fonts. Outlook, in particular. I think the geeks think small is cool. It’s a struggle keeping things at a size I can read. To date, there are parts of Outlook whose font size I haven’t been able to fix, despite repeated searches on Google.
Still, overall, I love the Surface. I even used it as a tablet at a meeting, taking notes with its stylus. (Yeah, I’m not immune to cool). Well, there goes another trusty friend, my black, bound notebook/journal. The Surface also takes up less room on my desk than the ASUS. However, another natural law prevents me from enjoying the new found spaciousness. Ever hear of the law that nature abhors a vacuum? Well, I proved its existence. My desk is as cluttered as ever.
Do I recommend the Surface? Yes. It’s a winner from a company that I grew up in the tech age with. Microsoft is 35 years old or so. In tech years that’s positively ancient. And here they are with some new innovations! Gives me hope for my own company. I can empathize with their attempts to innovate in this world of 20-something entrepreneurs. It does prove we old geezers ain’t quite dead yet.
I have just finished submitting a proposal to DARPA, the Defense Advanced Research Projects Agency, for my company, Rocket Science Technologies. It was a long, grueling effort: first defining the technology and concept we were going to propose, then assembling a team, writing the proposal, estimating the program/project, reviewing it, and finally submitting. It was challenging, as proposals usually are. This one had its own particular difficulties because I had to put together a team not only to respond to the BAA, but one that would also be available to work on the project should we win. My consulting company is built on a virtual basis. We have consultants scattered around the country. That wouldn’t work for the proposed program. We needed personnel who could work in a company lab to develop the required hardware. Fortunately, between LinkedIn and some networking, we assembled a slam-bang local team.
I consider myself a trained proposal writer and manager, having spent more than a decade with a company providing proposal training in support of their own internal proposal management system. I’ve also had training elsewhere with a similar process. The trouble is that all of the training relies on the infrastructure of a centralized location: a department in the company (or supplied by an external consultant) that includes proposal and book managers (i.e., managers of the technical, management, and cost volumes), editors, and illustrators, as well as a designated area isolated from the rest of the company for the team to work undisturbed. Unfortunately, in many of my recent proposal efforts, I didn’t have the luxury of that kind of infrastructure and support. Certainly not for this last proposal. Still, I think many of the lessons I learned in my formal training apply. It just requires a bit of ingenuity and perseverance to translate them to a small team, or even to a one-man-band grant writer.
Here are some of the do’s and don’ts of writing a proposal or grant application that I’ve learned over the years and think you will find helpful:
Do: answer the question “why you or your company?” Don’t: write a technical treatise. One of the biggest mistakes engineers, scientists, doctors, and other professionals make is the belief that immense technical detail will sell a grant reviewer. Nothing could be more wrong. A proposal is a marketing document, not a technical report. Its purpose is to convince someone to buy your services or product. Furthermore, the trend today is toward page-limited, shorter proposals. The BAA I just responded to was for millions of dollars, and yet the page limit of the technical section was twenty pages. Even more telling, the section for describing our innovation was only three pages long. The remaining seventeen pages were a description of our approach to the problem, i.e., a mini-program plan, and a description of our team and its capabilities. The technical description is just a portion of what you need to win the grant. Remember, a request for a grant or proposal is usually issued to solve a problem the issuer has and can’t solve without your help. You need to convince the reviewer of the benefit of your solution and that you have the wherewithal to solve their problem. The answer is more than the technology. It’s you as a company or a researcher: your background, talent, and past experience. It’s the approach you’re going to take. You must convince them they can trust you with their money. A technical treatise does not do that.
Do: create a theme. Creating a proposal theme is one of the first things you should do, even if you’re doing all the writing yourself. A theme contains the customer benefit(s), the feature(s) of your product/service/research that provide the benefit, and corroboration that provides factual backup to prove your claims. Why is a theme important? Consider it your elevator speech, you know, the one you’re trained to give when someone asks you what you or your company does. A theme is a summary statement, provided up front to your customer, that answers the question “why you” or “why your company.” It sets the tone and points the reader in the direction you want him or her to go. A theme also serves to focus your writing. Everything you do in the technical write-up should support your theme. It is also useful to write a theme statement for each of your major sections, again to focus your writing and to focus the grant’s reviewer on why you should be awarded the grant/contract. Answer the question of why they should select you.
Don’t: have the program/project manager/researcher manage the proposal. This was one of the things I suffered through on this last proposal: I was wearing both hats. If you’re a one-person band, so to speak, try to get someone to help with the details. The program/project manager/researcher is responsible for the technical content of the solution, the actual technical solution. The proposal manager is responsible for packaging the solution into a tight, responsive document, ensuring it meets all the requirements of the grant RFP. On this recent BAA, because of the nature of the team, I was wearing both hats. I found myself writing consultant non-disclosure agreements and consulting agreements while I was trying to write the program plan and put the proposal together. Those were incompatible activities. Don’t try to do everything yourself, even if the grant seems small enough. Something will fall through the cracks if you do.
Do: make a plan and stick to it. Lay out a plan and a schedule and keep to it. You’ll be surprised how fast the 30 days allowed by the government for an RFP response passes. The plan doesn’t have to be a treatise. It can be a schedule, a proposal outline with page limits for each section, and assignments if others are working with you. (Don’t forget the themes.) You will find if you’re running a team that at some point you will need to say “no more technical work. Finish your writing for review.” There are few things in life that are quite such a hard stop as a proposal due date. Even for taxes you can get an extension. For most proposals you can’t.
Don’t: do not skip internal reviews. Even if you’re a one-person band you need to have someone review the outline to make sure it meets all the proposal requirements, and to read the proposal at the end. Not just copy edit. It’s best if you can find someone knowledgeable but who may not be the expert in your specific field that you are. See if you can convince them that you deserve the grant. If they get lost in your technical jargon, most likely the requesting agency’s reviewers will, too. In addition, your internal reviewer may also help you make a salient point to support your position.
Oh, and good luck.
“This is not rocket science…” How many times have you heard that expression? In general, that statement is used to indicate that whatever you’re doing is not overly complex. It’s a tribute to the perceived complexity of rocket science. But just what is rocket science? Is it some arcane form of engineering that doesn’t relate to the things done in the commercial world? Or is there more to it? And, more importantly, can rocket science be relevant in today’s fast-paced market?
Dictionary.com’s first definition of rocket science is “rocketry” (English teachers used to scream at me for using different forms of the word in the definition but dictionaries seem to get away with it). Rocketry, in turn, is defined as “the science of rocket design, development, and flight.” The website’s second definition of rocket science is “something requiring great intelligence, especially mathematical ability.” So, on the surface, it appears rocket science is just that, the science of building and launching rockets with a nod toward things being complicated. Neither of those definitions satisfies me. Based on my experience in the industry, I believe they are incomplete. Only when you get into the nitty-gritty of a space launch does the essence of rocket science become clear. Rocket science is all about getting the details right.
With a space launch there are no second chances. There are no do-overs. If the launch fails, that’s it. A billion dollars may end up in the ocean, or scattered in pieces around the launch pad, or in a useless orbit around the Earth. No second chances. Once in space you can’t bring your malfunctioning satellite or probe into a local garage for repairs. You build in redundancies when you can and work to reduce the risk as much as you can.
The launch vehicle and its payload combined have hundreds of thousands of parts, subassemblies, and assemblies that must all work and function together for success to occur. A system with a million parts and 99.99% reliability can still exhibit a hundred malfunctions, of which any one may lead to a catastrophic failure. So the emphasis in the commercial space launch world is on the details. You have to get them all right. So when it comes down to it, rocket science is really the science of managing millions of details, while also working to bring operational risk down.
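The arithmetic behind that claim is worth making concrete. Here is a minimal sketch, assuming each part fails independently (a simplification; the million-part count and 99.99% figure are the illustrative numbers from the paragraph above, not measured reliabilities):

```python
def expected_failures(n_parts: int, reliability: float) -> float:
    """Expected number of malfunctioning parts, assuming each of the
    n_parts works independently with probability `reliability`."""
    return n_parts * (1.0 - reliability)

# A million parts at 99.99% per-part reliability still leaves ~100 bad parts.
print(expected_failures(1_000_000, 0.9999))   # ≈ 100

# And the chance that *every* part works is essentially zero:
p_all_good = 0.9999 ** 1_000_000              # ≈ e^(-100), vanishingly small
print(p_all_good)
```

That second number is the heart of it: with enough parts, “almost perfect” at the part level still guarantees something goes wrong somewhere, which is why every detail gets attention.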
How does that relate to you? Your project of replacing a machine on your assembly line, or developing a new drug, or testing your electronics package certainly doesn’t involve millions of details. True, but it still may entail hundreds, or even thousands, of interrelated tasks, components, and tests or inspections when you add up everything that has to be done. These are the details you must account for in what you do. Furthermore, there may be one detail that is ignored because your team members believe someone else must be paying attention to it. You find this out after it rises up and bites you in the butt. Or there may be a detail you just didn’t think of. There’s a reason why project management is one of the core competencies in rocket science. But it’s far from the only one.
Every component, subassembly, and assembly used in a rocket launch is either analyzed or tested to determine its suitability for use during the launch or on the payload. For many projects and products, this idea of analyzing and testing everything may seem like overkill, too expensive and time consuming. In the commercial world it probably is. Until you have a problem. Let me give you a real-life example. An automobile OEM supplier couldn’t seem to get its electronics to pass its shock or vibration testing. Since the electronics were packaged the same way as previous versions, they were confident in the design. However, to be on the safe side, because this new version was slightly different in size and shape, they decided to run some tests. They used the same fixture they had always used, one that had never had a problem. They made what seemed like a very slight modification to that fixture to accommodate new mounting holes. Yet the parts failed.
In discussions with them, the question arose whether it was the actual electronic hardware that failed or whether it was something in the test set-up causing the failure. They didn’t have the capability to run the analysis to know whether the fixture, the environment, or the part design was the cause. My company had the capability. Our engineer on the project had done this sort of analysis countless times. We proved the fixture was resonating, adding higher loads than the electronic components would see in real life. We helped them redesign the fixture and their parts passed.
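A resonating fixture is textbook single-degree-of-freedom vibration behavior, and the amplification it causes is easy to sketch with the standard transmissibility formula. The natural frequency and damping ratio below are illustrative assumptions, not values from the actual investigation:

```python
import math

def transmissibility(f: float, f_n: float, zeta: float) -> float:
    """Base-excitation transmissibility of a single-degree-of-freedom
    structure: the factor by which a fixture with natural frequency f_n
    and damping ratio zeta amplifies an input vibration at frequency f."""
    r = f / f_n
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)

# Well below resonance the fixture passes the load through nearly unchanged...
print(transmissibility(20.0, 100.0, 0.05))   # ≈ 1
# ...but at resonance a lightly damped fixture amplifies it roughly tenfold.
print(transmissibility(100.0, 100.0, 0.05))  # ≈ 10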
Rocket Science Technologies, Inc. has the knowledge and experience to help you in situations like this. We can guide you through this kind of failure recovery efficiently to find a solution. We can also help you plan your next new product design and development to avoid these kinds of issues, and help you corral those details so the risk of failure is significantly decreased. We can’t guarantee success, but we can improve the chances dramatically. And, in the case of something going wrong, we can help you get back on track and recover.
Rocket Science Technologies, Inc., has gathered engineers, physicists, mathematicians, and project managers as associates, available on an as-needed basis, to add their know-how to help you get past even the most challenging technical obstacles. All of RST’s personnel have shown through their careers a propensity for taking on tough problems and solving them. We relish solving technical challenges. We also understand the needs of the commercial market for reduced costs and higher velocity to production and market. Check us out at http://rocketscitech.com.
We cannot solve our problems with the same thinking we used when we created them. — Albert Einstein
A few days ago I was doing my usual finger exercises with the cable remote when I stumbled on a Charlie Rose interview with Larry Summers, the former Clinton Treasury Secretary and former head of Obama’s Economic Council. I listened for a few moments and was about to continue with my finger exercises when something Summers said caught my attention. If you were to look at income distribution (i.e., how income is divided among the population) in the United States, and compare the division now to what it was in 1979, you would find something very troubling. If the distribution today were the same as it was in 1979, 80% of the population (mainly the middle classes) would have $1 trillion more than it currently has, and the top 1% would have $1 trillion less. That comes to about $11,000 per family for the 80%. In discussing the causes of a slow-growth economy and income inequality, Dr. Summers pointed to the lack of demand as the cause, not lack of supply. And yet, since the time of Ronald Reagan, the emphasis of the government’s response to the economy and its problems has increasingly been on the supply side of the economic equation, with neglect of the demand side.
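Summers’ per-family figure survives a back-of-the-envelope check. The household count below is my own rough assumption for illustration, not a number from the interview:

```python
# Rough check of the "$11,000 per family" figure for the bottom 80%.
shifted_income = 1.0e12             # $1 trillion shifted, per the interview
households = 114e6                  # assumed total US households (illustrative)
families_80pct = 0.80 * households
per_family = shifted_income / families_80pct
print(round(per_family))            # on the order of $11,000
```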
A prominent economic theory in politics today seems to be that tax cuts for the rich and for corporations are the only way to stimulate the economy. The old concept of the so-called law of supply and demand seems to have been displaced by new schools of economics discounting the demand side as unimportant. In fact, it is argued that anything done to help the demand side will negatively impact the economy. For example, think about the arguments against increasing the minimum wage. Today the focus is on the supply side of the economy, i.e., the rich and corporations, to the neglect of the demand side. We do this in spite of the acknowledged fact that the economy is 73% consumer driven. Where does consumer spending fall? On the demand side of the equation.
Why the focus on one side? How did this situation arise? Part of it has to do with taxes. Everyone hates them. So a theory that purports to improve everyone’s wealth by cutting taxes is very appealing. In other words, it appears elegant. What does it mean for a theory to be elegant? In the vernacular, it means it’s “cool.” It also means it’s simple.
In physics there was an argument over the duality of a physical entity. Is light a wave or a particle? Big names in physics were divided on each side of the argument. Isaac Newton was the biggest proponent of the particle approach with his so-called corpuscle theory. On the other side were Rene Descartes, Robert Hooke, and Christiaan Huygens, all well-known physicists and mathematicians in their own right.
The apparent nail in the coffin for the particle theory came in the mid-19th century, when, in 1865, James Clerk Maxwell, the brilliant Scottish mathematician and physicist, published a series of equations known, not surprisingly, as Maxwell’s equations. These equations described light as a wave made up of electric and magnetic fields, the so-called electromagnetic waves. Not only were the equations extremely elegant, but they seemed to explain all the aspects of light, such as refraction, diffraction, reflection, etc. Within a decade or so, Maxwell’s equations had been anointed as the answer to the centuries-old argument by virtually every physicist. Elegance and simplicity (at least to a physicist or mathematician), just like supply-side economics.
Then a strange thing happened. While conducting experiments to further verify Maxwell’s equations, Heinrich Hertz accidentally discovered that light can stimulate metals to emit electrons, the so-called photoelectric effect. A seemingly small and unimportant discovery that was to change the world. Solar cells are a prime example of the application of the photoelectric effect. They produce electricity when exposed to sunlight. Not only did Maxwell’s equations not predict the effect, but when applied in the right manner, they predicted the wrong answer. According to Maxwell’s equations, the emission of electrons should depend on the intensity of the light, whatever its wavelength (which we see as color). In other words, shine a brighter light on the metal and more electrons should be emitted. Only that isn’t what occurs. Instead, the electrons respond only to the frequency of the light, not the intensity. There is a threshold frequency for each metal below which electrons are not emitted, no matter how bright the light.
So the Maxwell slam dunk was suddenly derailed. His equations no longer described every aspect of light. For the next two decades, physicists searched for an explanation. It took Albert Einstein to provide the solution. His solution required light to act as a particle with a discrete energy based on the light’s frequency (color). But Einstein went a step further. He argued that light was both a wave and a particle and could act in either sense depending on the application. In other words, light consisted of photons that also acted as a wave. Einstein was awarded the Nobel Prize in 1921 for the photoelectric effect. His 1905 paper on the photoelectric effect started a revolution in physics that eventually led to the Quantum Theory (which, in turn, led to inventions such as lasers and electronic semiconductors). So, in reality, both sides of the argument about the nature of light were right. In some cases you could use Maxwell’s wave equations and would be correct. In other instances, you needed to use the particle aspect and Quantum Theory.
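For those who like it compact, Einstein’s resolution fits in a single line: the maximum kinetic energy of an emitted electron depends only on the light’s frequency, never on its intensity:

```latex
E_{\max} = h f - \phi, \qquad f_0 = \frac{\phi}{h}
```

Here h is Planck’s constant, f is the light’s frequency, and φ is the metal’s work function; below the threshold frequency f₀, no electrons are emitted, however bright the light.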
I think there’s a lesson here that applies to economics. There are two sides to economics, supply and demand. Ignore either side at your own risk. Ignore the fact that the US economy is 73% consumer driven and let the middle class fade away, and then see what happens. Let income inequality continue to expand, and then tell me who’s left to purchase the supply? The laws governing economics are two-sided. Nature tends to prefer equilibrium, i.e., a balance between two forces. For example, our sun operates as a balance, as an equilibrium between the heat generated by nuclear fusion in its core, and its massive gravity, which compresses the hydrogen in the core sufficiently to create the high temperatures required for fusion. When equilibrium is broken in nature, the effect is usually catastrophic. For the sun, when the hydrogen is expended, the equilibrium will break down and the sun will expand into a red giant, ultimately engulfing the Earth. (Don’t worry, that won’t happen for another four billion years.)
For the last three years, US corporate profits have been the highest they’ve been in a long time, but with minimal job creation (compared to the increase in profits) and certainly no wage growth for the middle class. One company reported its highest profit in history and still continued to lay people off. We hear the same old arguments: cut taxes and cut the budget. Forgo investments in our infrastructure. Keep wages suppressed. We’ve seen some recovery, but not nearly what we should expect at this point after a deep recession/mild depression.
We have a couple of ongoing experiments that should shed some light on this (please excuse the pun): Kansas and Wisconsin. In Kansas, we’ve had an extreme case of tax cuts for the wealthy. State budget deficits abound, and the economy is lagging the nation in its recovery. In Wisconsin, we’ve had a less extreme but still energetic application of supply-side-only policy. When both are compared to Minnesota, which took a more balanced approach, they significantly lag Minnesota’s growth numbers. Are these the economic equivalent of the photoelectric effect? Is it time for a more balanced theory and approach?
As Einstein said, “We cannot solve our problems with the same thinking we used when we created them.” I also think of another, better-known quote attributed to Einstein: “Insanity: doing the same thing over and over again and expecting different results.”
We need a more balanced approach, one that accounts for both supply and demand. We need a change in thinking. Albert, where are you when we need you?