Category Archives: current events
Sometimes insight and inspiration come from an unlikely place. Recently, I was invited to join the Facebook page for my high school’s 50th reunion, which is next year. As expected, it was fun hearing from those voices from the past, though I believe the Facebook connections were only a small fraction of the 1100 graduates in our class. My high school, Brooklyn Technical High School, or Tech, as we called it, was a specialized high school that drew its students citywide and required a test to get in. I guess Tech was one of the forerunners of today’s magnet schools. Tech’s curriculum was designed to prepare us for entry into the technical industries, with course majors in aerospace, electronics, chemical engineering, etc. It was rigorous and it was tough. And, hell, I was one of those geeks running around with a pocket protector with a six-inch steel ruler clipped to it, and a slide rule clipped to my belt (and I wasn’t teased for it because that was the norm).
One of my classmates commented that, during a visit to the school, he noticed the foundry classroom had been converted into a storeroom. Foundry was one of those classes that was supposed to prepare us for a technical career. It was a shop class on how sand molds were made to cast steel parts. We also took another class called Industrial Processes that covered how metals, wood, and plastics were processed in industry. This was all part of our training to ready us for a technical career in the 1960s. One highlight of the Industrial Processes class was a road trip to a Bethlehem Steel plant in Pennsylvania to view in operation the open hearth and electric arc furnaces that fabricated steel and steel parts. Even though low-cost imports were just beginning to come in from Japan, the plant was still a thriving, busy facility. My classmate’s comment about the foundry class struck me immediately as a metaphor for what has happened in the US in the last three or four decades. I wondered about that Bethlehem Steel plant and did a quick Google search, only to learn that the company had gone bankrupt in 2001. The plant that I had visited is now a Sands Casino (according to Wikipedia).
I’m now living in a Buffalo suburb. There are many old red-brick buildings in Buffalo and Lackawanna that remind me of the buildings at that Bethlehem Steel plant. These, too, used to be factories and manufacturing facilities employing thousands of people in well-paying jobs. They are now apartment lofts and museums. Yes, SolarCity is building a new plant here that will supposedly hire 1460 workers, but that is a shadow of what industry used to employ here. There is an effort to fund biomedical startups, but no one is under the illusion that we’ll be able to match the employment of my parents’ generation. Too many of the grandkids of the workers from those old plants now have far fewer opportunities for good-paying manufacturing jobs. Maybe they can get jobs at some of the local call centers (if the centers haven’t all moved to India) or as healthcare workers taking care of their grandparents. Unfortunately, many of those jobs don’t come with benefits. So, do you really think this isn’t part of what’s powering the churn and disruptions in this year’s election?
While browsing Tech’s website, I discovered that the method for students to choose their majors had been changed. In the sixties, you simply chose your major. Today it’s a process that involves something called the Power Index (PI), where each student is ranked according to his or her academic average, with some weighting on a couple of critical courses. Students then go online and list their choice of majors in order of preference. Those with the higher PI get their first choice, and so on. Why is this process necessary? I think you can guess. I bet most of the students probably want to go into computer science. Well, why not? That’s where the money is these days. Unfortunately, the tech industry has not come close to filling all those abandoned red-brick buildings with jobs. Not when they make their hardware, and even their software, overseas.
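The PI-based selection process is essentially a greedy assignment: rank everyone, then give each student, in rank order, the highest remaining preference that still has seats. A minimal sketch of that idea, where the blending weights, names, and seat counts are all my own assumptions (Tech’s actual formula isn’t given here):

```python
# Hypothetical sketch of a Power Index (PI) style major assignment.
# Students are ranked by PI (overall average blended with a few key courses),
# then seated greedily: highest PI gets first pick of remaining seats.

def power_index(average, key_grades, key_weight=0.25):
    """Blend the overall average with critical-course grades (weight assumed)."""
    key_avg = sum(key_grades) / len(key_grades)
    return (1 - key_weight) * average + key_weight * key_avg

def assign_majors(students, capacity):
    """students: list of (name, pi, majors-in-preference-order)."""
    seats = dict(capacity)
    assignment = {}
    for name, pi, prefs in sorted(students, key=lambda s: -s[1]):
        for major in prefs:
            if seats.get(major, 0) > 0:   # first listed choice with room
                seats[major] -= 1
                assignment[name] = major
                break
        else:
            assignment[name] = None       # no listed choice had seats left
    return assignment

students = [
    ("Ana", power_index(95, [98, 97]), ["CS", "Aerospace"]),
    ("Ben", power_index(90, [88, 91]), ["CS", "Chemical"]),
    ("Cal", power_index(85, [80, 84]), ["CS", "Aerospace"]),
]
print(assign_majors(students, {"CS": 1, "Aerospace": 1, "Chemical": 1}))
```

With only one CS seat, everyone lists CS first but only the top-ranked student gets it, which is exactly why a ranking scheme becomes necessary once one major dominates the preferences.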
The method of major selection is also a metaphor for today’s data-driven society. At one place I worked, I was forced to rank the engineers reporting to me. The bottom 10% were mandated to be graded as Needs Improvement, even if their work was satisfactory. This was in line with Jack Welch’s philosophy of ranking all workers and firing the bottom 10% every year. Today, workers are commodities that can be discarded. Yes, I know the theory that to manage something you must first measure it. Problem is, people aren’t cogs.
I remember being told, “Don’t worry, even if the Japanese take over the steel and auto industries, we still have electronics.” Then a decade later, we were told, “Don’t worry about the electronics manufacturing plants that are being moved to Korea and Taiwan, because we still have the software and engineering.” Then a decade later we were informed of the research and engineering centers being opened in China by our transnational corporations. And so it goes. Add the impact of automation on manufacturing, and the future of those kinds of jobs here looks rather bleak.
The rise of Trump and Sanders in this election season comes as no surprise to me. A century ago, William Jennings Bryan led a populist revolt against industrialization. He lost, but the Industrial Revolution went on to create enough industrial jobs to help mitigate the transition. The Information Revolution has not supplied an equivalent number of replacement jobs, and it is diligently working to eliminate more of them with automation. So what’s next? Tell me: what will the future be for my grandkids?
This is the first in a series of blogs addressing this issue.
See if you can identify this “discussion”.
Doubter: “The sky is not blue.”
Scientist’s Initial Response: “Can’t you see it? Just look at it.”
Doubter’s Response: “No. I’m color blind.”
Scientist’s Response: “Well, everyone else who isn’t color blind can see it.”
Doubter’s Response: “How do you know that you’re really seeing blue? How do you know that something in our diets isn’t impacting our color perception?”
Scientist’s Response: “Neurologists say that isn’t true. They’ve conducted studies to show we see the real blue.”
Doubter’s Response: “They’re all conspiring because it would be embarrassing for them to admit they hadn’t picked this up on their own. The government is paying all the researchers to support the ‘sky is blue’ theory because of the cost of changing all those American flags to the real blue.”
Now the same argument in a Global Warming context.
Doubter: “Carbon dioxide isn’t causing Global Warming.”
Scientist: “We have the data supporting that it does.”
Doubter: “I’m not a scientist. How can I interpret this?”
Scientist: “This has all been reviewed by peers and thousands of scientists around the world.”
Doubter: “This is all a conspiracy by Liberals who want big government. All of the scientists are going along with it so they can be funded with work. NASA and NOAA falsified the data.”
My response to the doubter (and I hope yours): “Really?”
Note: A shortened version of this blog appeared in the Buffalo News “Another Voice” section on November 8, 2015
In one of those strange ironies of history, the automobile arrived as a response to an environmental issue of the time. In 1900, there were more than 100,000 horses in New York City and Brooklyn, creating about 4000 tons of manure and urine daily that had to be removed. Hundreds, if not thousands, of workers toiled daily to cart off that mess. Horse manure had become a significant health hazard for urban dwellers. There were even reports of a haze of manure and urine in the air in poor neighborhoods where the cleanup was not as effective.
A Rich Man’s Toy
Some saw the automobile as a potential solution to this problem. However, this was a time when automobiles were still in their infancy and could only be afforded by the wealthy. I bet those workers who mopped those city streets, along with the buggy and buggy-whip makers, led those who derided these newfangled toys with shouts of “Get a horse!” when an early automobile drove by. Yet there was enough interest in the nascent automobile industry to spawn hundreds of automobile companies, each trying to build a better car and create a new market. Most of those companies came and went as they failed to find the magic elixir to excite the public. This is often the case with the introduction of new technologies. Witness what happened in the dot-com mania of the nineteen-nineties, when many of the companies touting a new business paradigm failed to succeed and create that paradigm. I bet the buggy-whip makers pointed to failing early automobile companies as proof of the folly of automobiles, just as the owners of brick-and-mortar establishments did during the rise of the Internet, and just as the fossil fuel companies and other opponents of renewable energy point to the failure of solar companies like Solyndra as proof that renewable energy is doomed. In the first decade of the last century, the automobile seemed relegated to being a toy, a plaything of the rich, much as the Tesla electric car is considered by some today.
Henry Ford: Game Changer
Then, in 1908, along came Henry Ford with his Model T automobile and everything changed. Ford touted the Model T as the “every man’s” automobile, and later paid among the highest wages in industry so that his own workers could afford the cars they built. Sure, at the time, the Model T was probably more expensive to purchase than a horse, but what you could do with it! Suddenly the average worker could afford a car, and the horse manure problem was solved. The automobile took off and became a huge industry employing thousands.
And what do you think happened to those workers who cleaned the manure off city streets when there was no more manure to remove? They probably ended up paving those streets for automobiles or they became automobile mechanics and gas station attendants. What about those workers at the buggy whip makers who lost their jobs? They found higher paying ones in automobile factories.
One man’s risk is another’s opportunity. Economists and historians have a term for this process of new industries and technologies replacing older ones: creative destruction. It has happened time and time again in history. Prime examples include the supplanting of 19th-century individual artisans by corporations using machinery developed during the Industrial Revolution, and the aforementioned rise of the Internet. In such instances there were winners and losers among companies, but the winners always drove the economies to greater heights.
I’m reminded of that wonderful diatribe by Danny DeVito in the movie “Other People’s Money” (https://www.youtube.com/watch?v=62kxPyNZF3Q) where he played a 1980s style corporate raider, Larry the Liquidator, trying to take over a family-run wire-making manufacturing firm in New England. In his diatribe he talks about buggy whip makers. “You know, at one time there must’ve been dozens of companies makin’ buggy whips. And I’ll bet the last company around was the one that made the best goddamn buggy whip you ever saw.” Then came the zinger. “Now how would you have liked to have been a stockholder in that company?” Isn’t it time to replace the 20th century source of energy with a 21st century source?
Do You Want to Own a 21st Century Buggy Whip Company?
Yes, there will be some disruption as we switch to renewable energy and sustainable manufacturing. However, in the long run, new industries will be created along with new jobs, and the economy will grow based on those new industries. Germany already gets roughly a third of its electricity from solar and other renewable sources, and still remains a competitive world industrial power. Today’s fossil fuel companies are the buggy whip makers of the 21st century.
And, yes, the automobile ultimately played a significant role in another environmental problem, but along the way it contributed to a huge leap in prosperity, helping to create the richest country in history. So now we’ll use technology to solve the problem created by automobiles and fossil-fuel electricity generation. Along the way we’ll spawn whole new industries with new opportunities. That’s just how the world works; the whole arrow of human history points that way. Who knows whether SolarCity, Elon Musk’s and others’ bet on solar energy, is the next Ford Motor Company? Only time will tell, but it’s a step in the right direction.
More importantly, by adopting renewable energy, we may save the world for our children and grandchildren, but that’s a subject for another day.
To many, the recent stock market downturn is a hunker-down time. You know the markets will eventually recover; you just have to ride it out. Others have sold or shorted stock in hope of short-term gains, though history shows that market timing is more difficult than it seems. To still others this becomes an opportunity: those with free cash available will try to make a killing by timing the market and buying while prices are low. All of those represent standard “inside-the-box” thinking. I’ve always believed that one person’s downturn is another’s opportunity. During the recession that began in 1873, Andrew Carnegie used the downturn in prices and wages to build up his steel company, making investments in new equipment and hiring more workers. When the depression ended, his company was in position to take off, and the rest is history, as they say.
Now the world is facing a downturn in China’s growth. Add the stock market drop to the recent Chinese environmental and safety-related disasters, and the decrease in economic growth, and China isn’t looking like the path to riches it once did. To me China has always been a bubble waiting to burst. Well, it has. China may continue to grow but it will certainly mature at a much slower pace. Now we can decide what to do. We can hunker down and ride it out. The Federal Reserve will undoubtedly put off the expected increase in interest rates planned for the September time frame. The US economy will chug along at 2% growth or so. And a few rich people with free cash may make some more money by buying the temporarily depressed, cheaper stocks. Or we can take advantage of an opportunity here.
Sometimes an opportunity arises when two very divergent issues can be made to converge. Right now the US is faced with a deteriorating infrastructure and a lack of will to pay for it. (I’m being polite here). The highway trust fund is out of money and Congress isn’t willing to raise gas taxes to pay for it. For a Republican, voting to increase taxes can be likened to political suicide. So they voted a three month extension, kicking the can down the road.
At the same time, more companies that took business offshore may now be looking to return to the solidity of the US economy and its workers. The Chinese hare isn’t looking so fast any more, and the US tortoise may be looking more attractive right now. The problem is what to do with the profits that these multinationals have been stashing overseas. They certainly don’t want to pay what they consider the exorbitant 35% US tax rate (though they hardly ever pay at that rate) on these profits. Bloomberg reported that these stockpiled offshore profits may be as high as $2.1 trillion. So what to do?
Of course, Democrats certainly don’t want to lower the rates. It’s a matter of principle; “pay your fair share” is their battle cry. The argument goes that the multinationals will invest any taxes saved. The Democrats don’t buy that: they believe the funds saved through reduced taxes will go to stockholders in share buybacks and not result in increased US investment. So here’s the solution. It requires people to bite down a bit on their ideological urges. Congress passes a law that reduces the tax rate to 25% on those “stashed” profits but requires all the money collected to be used only for capital improvements. In other words, use the “windfall” of taxes to fund the Highway Trust Fund and other capital improvements. To the multinationals: consider this an investment. The capital improvements will not only enhance the nation’s efficiency, but also provide a short-term stimulus. Construction jobs typically pay well above minimum wage. More money in the pockets of consumers means more money to buy stuff – an opportunity for the corporations to sell stuff. The multinationals can also claim they are making an investment in America. Current low interest rates also provide an opportunity for these multinationals to move manufacturing back to the US at lower cost, and for the government to make more infrastructure investments through low-interest bonds. The Republicans can claim this as a tax cut (25% vs. 35%) and can claim they helped fix the nation’s infrastructure without raising taxes. The Democrats can declare victory in gathering owed taxes and funding infrastructure. Both sides can claim they helped bring back manufacturing. (Let their spin doctors fight that one.) The ultimate winner is the American people.
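The rough arithmetic behind the proposal is simple; using Bloomberg’s $2.1 trillion figure cited above and the two stated rates (everything else here is just multiplication, not a revenue-scoring model):

```python
# Back-of-the-envelope numbers for the repatriation deal described above.
stashed = 2.1e12              # stockpiled offshore profits (Bloomberg estimate)
statutory = 0.35              # nominal US corporate rate
deal_rate = 0.25              # proposed rate on repatriated profits

collected = stashed * deal_rate                    # earmarked for infrastructure
corporate_savings = stashed * (statutory - deal_rate)

print(f"Infrastructure fund: ${collected / 1e9:.0f}B")
print(f"Corporate savings vs. statutory rate: ${corporate_savings / 1e9:.0f}B")
```

About $525 billion for capital improvements, with roughly $210 billion in tax savings for the corporations relative to the statutory rate, which is the “both sides can claim victory” math in a nutshell.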
The recent Space-X Falcon 9 launch failure provides new ammunition to the critics of the new space launch commercial industry. These detractors point to the United Launch Alliance’s perfect launch record and shake their heads sadly at the entrepreneurial upstarts like Space-X. “See,” they say, “we warned you. You can’t take shortcuts.” Of course, those ULA launches cost two or three times more than a Falcon launch, and are using proven rocket technology, much of which has been in place for decades.
The space industry, which includes space launch, is no stranger to failures, many of them quite spectacular and public. It comes with the territory. Some failures occur because a technology is new and you don’t know what you don’t know. People forget about Vanguard and its early failures in its attempts to launch a satellite. Sometimes rockets fail due to a hidden design flaw that may not reveal itself for many missions, e.g., the shuttle Columbia. Often design flaws result from compromises driven by outside forces such as budget; one can argue the shortcomings of the shuttle program were due to budget cuts early on. Another type of failure can be attributed to a process breakdown. NASA’s Mars Climate Orbiter is an example of this sort: ground-based software supplied thruster data in English units instead of the SI (metric) units the spacecraft software expected, and the probe was lost.
For Space-X, a failure at this juncture of its program is worrying because its launch history is relatively short compared to its older competition. Coming on the heels of Orbital’s Antares launch failure and the Russian Progress failure, there are additional pressures to deal with over space station resupply.
My hope is that this is something Space-X will learn from and move on, but it may not be that easy. Recent interviews with Elon Musk, Space-X’s founder, indicate the cause wasn’t something simple and straightforward. Sometimes in a root-cause failure investigation you get lucky: the answer pops right out, and it turns out to be something simple. Other times it takes much more digging and painful probing. Those are the ones from which you really learn something about your product or your processes. Occasionally, you may even uncover a new fundamental aspect of engineering. I find we often learn more from our failures than from our successes. And, sometimes, the deep probing reveals something basically wrong with your approach.
Not the First
There’s much at stake here beyond just resupplying the space station, or even Space-X’s future. The space launch industry is at a cusp. The industry spent the first five or six decades of its life as a government entity, or at least dependent on and controlled by government agencies. Now we have a truly nascent industry, one approaching the business as a commercial enterprise where the aim is to make money. Of course, the prime contractors building the vehicles and conducting the launches in the past were in business to make money, but they were doing it under the control of, and dependent on, government agencies such as NASA and the DoD. Now the new space entrepreneurs are trying to do it on a commercial basis, in a competitive market.
This isn’t the first attempt to reduce the cost of space projects. NASA in the 1990s under Daniel Goldin attempted a “faster, better, cheaper” approach. Sixteen projects were conducted under this umbrella: ten successes, some spectacular (e.g., the small rovers Spirit and Opportunity, designed for a 90-day life, one of which ran for more than a decade and is still operating), and six failures, all spectacular in the sense of riveting news stories. The same detractors now criticizing Space-X pointed to those six failures with an “aha!”, and things returned to the way they were always done: near-100% success, but at higher cost. However, what those detractors often ignore is that those sixteen projects under Daniel Goldin together cost less than one traditional NASA project. It’s just that we live in a society where public failure is unacceptable, and those with the best PR who scream the loudest win.
History May Repeat
Reducing launch costs is one key to democratizing access to space beyond a few governments and large multinational corporations. In some ways, I liken the space industry’s current status to the early years of the automobile industry, or even the personal computer industry. By the early 1900s there were almost 200 automobile companies in the world, each catering to the wealthy, providing them with a new plaything: the automobile. Then along came Henry Ford and the Model T, and things changed. Suddenly the middle class, and, later, the working class, could afford a car. The rest, as they say, is history. Only a few of the early automobile makers survived the churn Ford caused, but the automobile became a mainstay of American life.
The computer industry has a similar history. In the early 1980s, the personal computers made by Atari and Commodore were viewed as little more than game consoles, until Steve Jobs with the Mac and Bill Gates with MS-DOS arrived on the scene. Then computers became capable of doing office work, and, once again, the rest is history. You can point to a similar path in the development of railroads and aviation. Someone had to step up and take the risk of opening the technology to everyone.
Space Industry at a Cusp
That’s exactly where we are in the space launch industry. I’m not saying that Space-X’s Dragon capsule or Virgin Galactic’s SpaceShipTwo is today’s Model T (as much as Richard Branson might like us to believe). After all, a flight on SpaceShipTwo will run a quarter of a million dollars. Not exactly the stuff for everyman. I liken these vehicles more to Oldsmobile’s “Curved Dash.” Contrary to popular belief, Henry Ford did not invent the assembly line; that was Ransom Olds, building his Curved Dash Oldsmobile in 1901. Ford took Olds’ concept a step further with interchangeable parts and created a vehicle priced for the average person (helped by his willingness to pay workers a living wage so they could afford to buy the Model T). The Model T sounded the death knell for many industry stalwarts, like the buggy whip makers who had spent so much time belittling the automobile. Economists and historians call this creative destruction. (See Danny DeVito’s rant in “Other People’s Money.”)
Where Do We Go From Here?
The findings on the Falcon 9 failure may prove critical in this evolution of our access to space. I believe Space-X is making the leap forward in launch cost reduction mainly through process change and through a less top-heavy organization. Yes, Space-X has made advances in the thrust-to-weight ratios of its engines, but those advances are hardly revolutionary; they’re using decades-old liquid rocket engine technology that they’ve updated. They’re relying on process changes and a leaner organization for the big step down in costs. They’ve pulled as much fabrication and assembly in-house as is feasible so they have better control of the processes and the cost. They’re treating space launch as a business, not like the launch of the next space probe to Mars. If the cause of the failure is discovered to be something basic to their processes, then the march toward everyman’s space may be diverted for the time being.
Virgin Galactic, another startup focused on space tourism, doesn’t have the final answer to low-cost access to space, either. While the “fares” for its suborbital flights are predicted to cost $200K or more, it may still eventually commercialize suborbital flight and move it toward a more democratic availability. That will provide some commercialization success for space access, but it will not address the 800-pound gorilla in the room: low-cost access to Earth orbit, which is the key to a true commercial space industry. There is a factor of sixty or more between the energy required to reach a sustainable orbit and the energy of SpaceShipTwo’s roughly Mach 2 suborbital flight. That is still the challenge everyone faces.
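That energy gap is easy to sanity-check with kinetic energy alone. The velocities below are my own round numbers (about 7.8 km/s for low Earth orbit, Mach 2 taken as roughly 690 m/s); including the climb to altitude would change the details but not the conclusion:

```python
# Kinetic energy per kilogram scales with velocity squared, so the
# orbital-vs-suborbital energy gap is roughly (v_orbit / v_suborbital)**2.
v_orbit = 7800.0    # m/s, approximate low-Earth-orbit velocity
v_mach2 = 690.0     # m/s, roughly Mach 2 (assumed round number)

def ke_per_kg(v):
    """Kinetic energy per kilogram, in joules."""
    return 0.5 * v**2

ratio = ke_per_kg(v_orbit) / ke_per_kg(v_mach2)
print(f"Orbital-to-suborbital energy ratio: ~{ratio:.0f}x")
```

The velocity ratio alone is about 11, so the energy ratio lands well over 100 – comfortably within the “factor of sixty or more” quoted above, and it shows why suborbital tourism is a very different problem from orbital launch.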
Still, Space-X’s attempt to bring down launch costs and commercialize space is the next required step in the evolution of access to space. In the commercial world, perception is everything. If the cause of Space-X’s failure is proven to be intrinsic to Space-X’s commercial approach, then we’re back to the old way of doing things, and space access for everyman is a long way off. We may find ourselves waiting for a truly revolutionary technology to achieve low-cost space access – something like antigravity – that may never come along, or at least not for decades. On the other hand, if Space-X can find the cause of the failure and move on, then the process of creative destruction will continue. Without a revolutionary technology we may never achieve the $1-$5 per pound cost of the current airline industry (Falcon 9’s estimated launch costs are in the $1800-2500/lb range), but launch costs still may come down enough to mimic the aviation industry of the mid-1950s, when inflation-corrected airfares were about five times higher than now. The average person flew less frequently, but they still flew. So I’m watching the outcome of this failure investigation with interest.
We cannot solve our problems with the same thinking we used when we created them. — Albert Einstein
A few days ago I was doing my usual finger exercises with the cable remote when I stumbled on a Charlie Rose interview with Larry Summers, the former Clinton Treasury Secretary and former head of Obama’s Economic Council. I listened for a few moments and was about to continue with my finger exercises when something Summers said caught my attention. If you were to look at income distribution (i.e., how income is divided among the population) in the United States and compare the division now to what it was in 1979, you would find something very troubling. If the distribution today were the same as it was in 1979, 80% of the population (mainly the middle classes) would have $1 trillion more than it currently has, and the top 1% would have $1 trillion less. This comes to about $11,000 per family for the 80%. In discussing the causes of a slow-growth economy and income inequality, Dr. Summers pointed to lack of demand as the cause, not lack of supply. And yet, since the time of Ronald Reagan, the emphasis of the government’s response to the economy and its problems has increasingly been on the supply side of the economic equation, with neglect of the demand side.
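That roughly $11,000-per-family figure checks out with simple division, if you assume a US household count on the order of 116 million (my number; the interview as recounted above doesn’t give one):

```python
# Rough check of the ~$11,000-per-family figure quoted above.
shifted = 1.0e12            # $1 trillion shifted away from the bottom 80%
households = 116e6          # approximate US household count (assumed)
affected = 0.80 * households   # the "80%" Summers refers to

per_family = shifted / affected
print(f"~${per_family:,.0f} per family")
```

That works out to just under $11,000 per family, consistent with the number Summers cited.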
A prominent economic theory in politics today seems to be that tax cuts for the rich and for corporations are the only way to stimulate the economy. The old law of supply and demand seems to have been displaced by new schools of economics that discount the demand side as unimportant. In fact, it is argued that anything done to help the demand side will negatively impact the economy; think about the arguments against increasing the minimum wage. Today the focus is on the supply side of the economy, i.e., the rich and corporations, to the neglect of the demand side. We do this in spite of the acknowledged fact that the economy is 73% consumer driven. And where does consumer spending fall? On the demand side of the equation.
Why the focus on one side? How did this situation arise? Part of it has to do with taxes. Everyone hates them. So a theory that purports to improve everyone’s wealth by cutting taxes is very appealing. In other words, it appears elegant. What does it mean for a theory to be elegant? In the vernacular, it means it’s “cool.” It also means it’s simple.
In physics there was a centuries-long argument over the nature of a physical entity: is light a wave or a particle? Big names in physics were divided on each side of the argument. Isaac Newton was the biggest proponent of the particle approach, with his so-called corpuscular theory. On the other side were Rene Descartes, Robert Hooke, and Christiaan Huygens, all well-known physicists and mathematicians in their own right.
The apparent nail in the coffin for the particle theory came in 1865, when James Clerk Maxwell, the brilliant Scottish mathematician and physicist, published a series of equations known, not surprisingly, as Maxwell’s equations. These equations described light as a wave made up of electric and magnetic fields, the so-called electromagnetic waves. Not only were the equations extremely elegant, but they seemed to explain all the aspects of light, such as refraction, diffraction, reflection, etc. Within a decade or so, Maxwell’s equations had been anointed by virtually every physicist as the answer to the centuries-old argument. Elegance and simplicity (at least to a physicist or mathematician), just like supply-side economics.
Then a strange thing happened. While conducting experiments to further verify Maxwell’s equations, Heinrich Hertz accidentally discovered that light can stimulate metals to emit electrons, the so-called photoelectric effect. A seemingly small and unimportant discovery that was to change the world: solar cells, which produce electricity when exposed to sunlight, are a prime application of the photoelectric effect. Not only did Maxwell’s equations not predict the effect, but when applied to it, they predicted the wrong answer. According to Maxwell’s equations, increasing the intensity of the light should increase the energy of the emitted electrons: shine a brighter light on the metal, and more energetic electrons should come out. Only that isn’t what occurs. Instead, the electrons’ energy responds only to the frequency of the light (which we see as color), not its intensity, and there is a threshold frequency for each metal below which no electrons are emitted at all, no matter how bright the light.
So the Maxwell slam dunk was suddenly derailed; his equations no longer described every aspect of light. For the next two decades, physicists searched for an explanation. It took Albert Einstein to provide the solution. His solution required light to act as a particle with a discrete energy based on the light’s frequency (color). But Einstein went a step further: he argued that light is both a wave and a particle and can act in either sense depending on the application. In other words, light consists of photons that also act as a wave. Einstein was awarded the 1921 Nobel Prize for the photoelectric effect, and his 1905 paper on it started a revolution in physics that eventually led to quantum theory (which, in turn, led to inventions such as lasers and semiconductor electronics). So, in reality, both sides of the argument about the nature of light were right. In some cases you can use Maxwell’s wave equations and be correct; in other instances, you need the particle picture and quantum theory.
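Einstein’s photon picture fits in one line: an electron is ejected only if a single photon’s energy hf exceeds the metal’s work function φ, with the excess appearing as kinetic energy, K = hf − φ. A quick numerical illustration, using cesium’s approximate textbook work function (about 2.1 eV); the rest is just Planck’s relation:

```python
# Einstein's photoelectric relation: K = h*f - phi, with emission only if h*f > phi.
h = 6.626e-34           # Planck's constant, J*s
eV = 1.602e-19          # joules per electron-volt
phi_cesium = 2.1 * eV   # work function of cesium (approximate textbook value)

def kinetic_energy(freq_hz, work_function):
    """Max kinetic energy (J) of an ejected electron, or None below threshold."""
    excess = h * freq_hz - work_function
    return excess if excess > 0 else None

red = 4.3e14    # Hz -- below cesium's threshold: no emission, however bright
blue = 6.5e14   # Hz -- above threshold: electrons come out

print(kinetic_energy(red, phi_cesium))    # None
print(kinetic_energy(blue, phi_cesium))   # small positive energy in joules
```

Notice that intensity appears nowhere in the relation: a brighter red lamp still ejects nothing, while even dim blue light does, which is exactly the behavior Maxwell’s wave picture could not explain.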
I think there’s a lesson here that applies to economics. There are two sides to economics, supply and demand. Ignore either side at your own risk. Ignore the fact that the US economy is 73% consumer driven and let the middle class fade away, and then see what happens. Let income inequality continue to expand, and then tell me who’s left to purchase the supply? The laws governing economics are two-sided. Nature tends to prefer equilibrium, i.e., a balance between two forces. For example, our sun operates as a balance, as an equilibrium between the heat generated by nuclear fusion in its core, and its massive gravity, which compresses the hydrogen in the core sufficiently to create the high temperatures required for fusion. When equilibrium is broken in nature, the effect is usually catastrophic. For the sun, when the hydrogen is expended, the equilibrium will break down and the sun will expand into a red giant, ultimately engulfing the Earth. (Don’t worry, that won’t happen for another four billion years.)
For the last three years US corporate profits have been the highest they’ve been in a long time, but with minimal job creation (compared to the increase in profits) and certainly no wage growth for the middle class. One company reported its highest profit in history and still continued to lay people off. We hear the same old arguments. Cut taxes and cut the budget. Forgo investments in our infrastructure. Keep wages suppressed. We’ve seen some recovery, but not nearly what we should expect at this point after a deep recession/mild depression.
We have a couple of ongoing experiments now occurring that should shed some light on this (please excuse the pun): Kansas and Wisconsin. In Kansas we’ve had an extreme case of tax cuts for the wealthy. State budget deficits abound and the economy is lagging the nation in its recovery. In Wisconsin, we’ve had a less extreme but still energetic application of the supply-side-only approach. When both are compared to Minnesota, which took a more balanced approach, they are significantly lagging Minnesota’s growth numbers. Are these the economic equivalent of the photoelectric effect? Is it time for a more balanced theory and approach?
As Einstein said, “We cannot solve our problems with the same thinking we used when we created them.” I also think of another, better-known Einstein quote: “Insanity: doing the same thing over and over again and expecting different results.”
We need a more balanced approach, one that accounts for both supply and demand. We need a change in thinking. Albert, where are you when we need you?
Will the National Security Council Spearhead Government Effort to Combat Antibiotics Resistance by New Superbugs?
The President’s Council of Advisors On Science And Technology (PCAST) issued a report in September to the president on the increasing resistance of killer bacteria to existing antibiotics and the threat it poses to the United States and the rest of the world. The report received minimal fanfare but was referenced in today’s NY Times article “Superbugs Kill India’s Babies and Pose an Overseas Threat” concerning the increase in antibiotic-resistant bacteria in India and the threat this poses to the rest of the world. The PCAST report reviews the growth of this problem of drug-resistant bacteria and offers potential solutions.
The PCAST report starts off with us imagining a world without antibiotics, like it was at the turn of the 20th century when “…as many as nine women out of every 1,000 who gave birth died, 40 percent from sepsis. In some cities as many as 30 percent of children died before their first birthday. One of every nine people who developed a serious skin infection died, even from something as simple as a scrape or an insect bite…” And the list goes on. We’ve grown accustomed to having antibiotics at our beck and call. Infection? Run to the doctor and get a shot. Magically you’re cured. Only that’s now changing. In India, according to the Times article, as many as 58,000 babies die annually from infections caused by these so-called superbugs which thrive in India’s sewers, rivers, and people due to poor sanitation. While this is only a fraction of the infant deaths in India, the number is increasing dramatically every year. More importantly, now some of these bugs have migrated to Europe and the US to join those that we already have here. Furthermore, these new so-called superbugs are virtually immune to all existing antibiotics.
This issue of the growing number of so-called superbugs is not new, but it’s rising in importance. According to the PCAST report, the CDC estimates the cost to our economy of the health care related to these infections at $20-35B. This will only go up as more of these superbugs appear and become more common. The issue has arisen from the overuse of antibiotics in humans and in agriculture. Furthermore, there are few new antibiotics in the pipeline because of the difficulty in creating these drugs, the long and expensive development and testing process required, and the lower profit numbers associated with antibiotics.
The limitation in the profit-making ability of antibiotics is not to be taken lightly. Most of the superstar drugs today are lifestyle drugs in the sense that they don’t cure you but they control symptoms over your lifetime. They’re medicine you’re likely to be on for the rest of your life. A recent Tufts study estimated the cost of bringing a drug to market at more than $2.5B. Lifestyle drug development costs can be recovered over years, as opposed to drugs like antibiotics, which involve a short twenty- or thirty-pill prescription and you’re done. We can see the impact of the costs for short-run drugs in lifesaving cancer drugs that are taken for a relatively short time by a limited number of people and cost hundreds of thousands of dollars per year. In contrast, the “lifestyle” drugs taken over many years, e.g., cholesterol control, heart, and arthritis drugs, cost in the $5000/yr range. The expenditures for developing both types of drug are relatively similar. It’s the number of people using them and the length of time they’re used that result in the different prices to the user. There is some concern that the development and clinical costs for antibiotics may be still higher, making it even more difficult to recover costs.
The PCAST report recommends appointing a member of the National Security Council as Director for National Antibiotic Resistance Policy (DNARP), who would report to the president to help coordinate a top-level government-wide Task Force on Combating Antibiotic Resistant Bacteria (TF-CARB) that is co-chaired by the Secretaries of Agriculture, Defense, and Health and Human Services. They will be tasked with developing steps to address the antibiotic-resistant bacteria that include:
- Expanding the surveillance of antibiotic use. The report indicates that 50% of antibiotic prescriptions are not needed and that is “a major contributor to rising antibiotic resistance.” This effort will include funding support for improved data gathering by local public health organizations to report on the use of antibiotics and to gain better data on the scope of the problem.
- Increasing the longevity of existing drugs by better managing their use, addressing outbreaks, and working to reduce the growth in antibiotic-resistant organisms. This includes addressing the issue of the use of antibiotics in agriculture.
- Increasing the rate at which new drugs to combat these infections are developed. This includes adding additional direct federal funding to support R&D, using non-traditional organizations such as DARPA (the Defense Advanced Research Projects Agency), addressing the costs and time required to approve these drugs, creating new economic incentives for pharmaceutical companies, and offering prizes (similar to the X Prize) for new diagnostics.
Some may feel this is another power grab by the government and that it can be simply addressed by market forces. Well, the fact is, the market hasn’t responded yet because of the cost and profit issues mentioned earlier. The PCAST working group included members of the agricultural, biotech, and pharmaceutical industries from such organizations as AstraZeneca, Iroko Pharmaceuticals, GSK, Novartis, and Smithfield Foods. To wait for the market to respond will be too late. The reduced effectiveness of antibiotics is rapidly becoming a major public health issue that will eventually impact all of us, especially our children and grandchildren. Hopefully, this effort will not be stymied by the rancor and partisanship in Washington.
Last week two of the so-called commercial space vehicles under development failed in fiery splendor. To some it’s vindication that the “NASA” way is correct. To others it displays the hubris of the billionaires funding these vehicles. Others wonder why we’re even wasting our time with this stuff when there are so many other problems in the world.
Some may think what I’m about to write is corny. Others might see it as far-fetched and pie-in-the-sky (or worse). That’s okay. Because I suspect there will be others who’ll get it. Why go into space? Why spend all the money and time and risk? Why try to cut the cost of space travel?
Why go into space? Because it’s there. Humans have always been a race of explorers. Or at least some of us have been explorers. That’s what drove us up the road to civilization. Not everyone is an explorer. When the American West was opened up in the 19th century, some elected to follow the exhortation to “Go West”, while many others elected to stay in their cities on the East Coast or on their family farms. As I said, that’s okay. There should be room for both. In aviation’s early years during the 1920s, some elected to fly with the barnstormers while others elected to remain on the ground and watch. That’s okay, too. Now virtually everyone flies.
Why go into space and why spend all the money and time and risk? To ensure the survival of the human race. As long as we’re stuck on one world we’re vulnerable to destruction. It may be a natural disaster. An asteroid. Or the “ring of fire” volcanoes suddenly erupting. Or simply a plague. It could be human induced climate change. Or maybe a war. If we had substantial settlements on other worlds or in space the human race would survive. Pie-in-the-sky and far-fetched? Maybe. But it’s the pursuit of those pie-in-the-sky and far-fetched dreams that brought us to civilization.
What about the risk? No one is forcing astronauts to fly into space. No one is forcing those who bought tickets on Virgin Galactic to purchase those tickets. Beyond that, risk is the price of advancement. Remember risk vs. reward. Or that old saying, “Nothing ventured, nothing gained.” Advancement comes at a price, and there are those who are willing to pay that price. There are those who aren’t. That’s fine, too. As for risk, what we’re seeing now is no different than in the early days of aviation, when crashes were far more common. Or even in the early days of the space program. Remember (and I’m dating myself) all those Vanguards blowing up on the launch pad before we finally got a Redstone to work and put the first US satellite into orbit? Remember Apollo One and the three astronauts who died in the fire?
Space is full of natural resources. Solar radiation, which can kill, is also a source of energy. Ice is plentiful. So is hydrogen. Those we know. We suspect that some of the asteroids may be rich in metals. Pie-in-the-sky? Maybe. But so was the transcontinental railroad. Or building a plantation in the wilds of Virginia in the 1600s. Some scoff. Some take action. History is full of people who say we can’t or shouldn’t. Fortunately, history is also full of those who ignore the naysayers.
Why cut the cost of going into space in the face of huge risks? The American West was not really opened up until the transcontinental railroad was built. Airlines weren’t really successful until the DC-3 and later airliners cut the cost and time to travel (as well as improved reliability). Airlines really took off when the airport infrastructure was built. Even the automobile wasn’t going anywhere (excuse the pun) as a means of mass transportation until Henry Ford built the Model T for everyone.
What about those billionaires? If they are driven by ego so what? Isn’t that the definition of an entrepreneur? Someone who is so sure of what they have to do and who may be willing to risk everything. Some say they’re just playing. Well, I guess so were those rich British aristocrats and merchants who funded that plantation in Virginia in the 1600s. So, to Elon Musk and Richard Branson and the others, you have the money and the will. More power to you. Do it. It’s how we got here.
Note: Part of this appeared as a comment on a NY Times article.
I’m a trained project manager with a project management certification, or PMP. As such, part of my PMP training includes risk management, a process used in industry to manage the prospective risks or uncertainties encountered during a project. After reviewing the discussion on global warming, I’ve come to the conclusion that risk management needs to be applied to the global warming debate. Risk management provides an approach to dealing with an issue that has some probability of occurrence and has the potential for devastating consequences. If you know something is definitely going to happen, it’s easier to weigh the costs and make a decision to deal with the consequences if they’re bad enough to warrant action. It becomes more difficult to deal with the consequences of something that might happen. In the latter case, you have to weigh the costs of mitigating something that might not happen (and therefore you’ve wasted the money) versus not doing something and dealing with the consequences. The tradeoff is like deciding whether to purchase insurance.
The idea of my applying risk management to global warming came about during my involvement in a number of LinkedIn group discussions centering on whether global warming/climate change is real or not, and to what degree humanity is responsible for it. Some of the discussions occurred in LinkedIn discussion groups representing science organizations, or at least people interested in science, and were quite technical in nature. The discussions delved into interpretation of geological data particularly from ice cores and evidence of past climate cycles. In the discussions, it seemed to me the term global warming referred to human-influenced changes in climate, while climate change is used for natural, long term changes in climate.
The media has reported that a majority of climate scientists support the idea of human-influenced global warming. In these LinkedIn discussions I observed that the scientific opposition centered on the interpretation of geological data and the lack of validation of climate models. It was pointed out in the discussions that the primary climate prediction model is in its 11th generation and we’re still not accurately modeling what has occurred already, much less the future. A recent article in the NY Times confirmed that the simulations are struggling, not because they’re wrong, but because they are limited by the complexity of the system and also by current computer capabilities.
To me, with my experience as an engineering project manager, it comes as no surprise that computer models and simulations sometimes don’t match measured data perfectly. The more complex a system, the more difficult it is for a computer model to perfectly match real world data. In some instances, as in the case of climate modelling, it becomes an iterative process, where each successive version of the model gets closer to the data as the modelers gain a better understanding of the physics, i.e., the response, of the system to various inputs. Sometimes, if a system is complex enough, it becomes a matter of available computer power. However, even when the model correlation to the data isn’t perfect, the simulations can be used to predict data trends. The models then become qualitative tools to help make decisions concerning a course of action.
Climate models are among the most complicated of all technical simulations, requiring the most powerful computers we have. I expect it will be a while before we can solve these models with a fine enough grid to get us the answers we need. Problem is, while we’re waiting, the Earth may be changing.
Prompted by the discussions and my thought of applying risk management to global warming I did research into the consequences of global warming, focusing on the potential impact to the coast of the United States if the oceans rise 7-10 feet as predicted. This is one of the primary outcomes described by climate scientists. Note, I was dealing with these as potential outcomes. So if the seas rise by the levels expected, a good portion of Manhattan would be under water, as would parts of the Carolinas, the Florida peninsula, and parts of Texas. The West Coast, with its higher shorelines, at least north of Los Angeles, would be less impacted. If you look at these consequences worldwide it gets even worse. Coastal flooding due to storm surge will also increase significantly. There will be many more Hurricane Sandys, and they will become more violent.
I also investigated the predicted weather pattern shifts across the US. Increased droughts are projected for the West Coast, including more forest fires and water shortages. Parts of the Midwest would also face severe droughts, key habitat changes, and higher temperatures, severely impacting its ability to continue acting as the breadbasket of America. Alterations in habitats for birds and other animals will have major consequences for the insect population. There is also an expectation for the East Coast of increased occurrences of storms like Hurricane Sandy, possibly with even greater intensity. Weather over the Midwest is also expected to turn more violent.
These were indeed dire predictions. Even if they’re only half right, the negative impact on our economy, and the potential for loss of life are still very high. Project management practices dictate that when identifying a risk with consequences to the project potentially as dire as the global warming predictions, even if the engineering simulations were mixed or uncertain, a risk mitigation plan is required. Even if you believe the probability is only 20% that global warming is real, the consequences are significant enough to require a plan and a response.
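The decision logic here is plain expected-value arithmetic. A minimal sketch of that calculation (all numbers below are hypothetical, chosen only to illustrate the comparison; they are not estimates of actual climate costs):

```python
# Illustrative expected-cost comparison for a risk decision.
# All figures are hypothetical, in arbitrary units.

def expected_cost(probability: float, consequence_cost: float,
                  response_cost: float = 0.0) -> float:
    """Expected cost of a strategy: what you pay up front,
    plus the consequence cost weighted by its probability."""
    return response_cost + probability * consequence_cost

p = 0.20              # assumed 20% chance the dire outcome occurs
consequence = 1000.0  # cost if it occurs, unmitigated
mitigation = 50.0     # up-front cost of the mitigation plan
residual = 200.0      # reduced consequence cost after mitigation

do_nothing = expected_cost(p, consequence)         # 0.20 * 1000
mitigate = expected_cost(p, residual, mitigation)  # 50 + 0.20 * 200

print(f"Accept the risk: expected cost {do_nothing:.0f}")
print(f"Mitigate:        expected cost {mitigate:.0f}")
```

Even at a modest 20% probability, an up-front mitigation cost that substantially shrinks the downside can carry a far lower expected cost than doing nothing, which is the same arithmetic behind buying insurance.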
So how do we deal with a risk like this? In industry, risk management provides a process to address risk. A quick sidelight: I received my PMP certification from the Project Management Institute (PMI), which is recognized worldwide. PMI publishes A Guide to the Project Management Body of Knowledge (PMBOK®), which summarizes the best processes involved in project management. Risk management is one of those processes included in the PMBOK®. There are four methods of dealing with risk:
- Accept the risk: Acknowledge the risk and accept the consequences.
- Avoid the risk: Remove the risk by eliminating whatever is causing the risk
- Transfer the risk: Pass it on to someone else, e.g., purchasing insurance
- Mitigate the risk: Make changes to reduce the probability of the risk occurring or prepare plans to ameliorate the consequences once they happen
Let’s look at the four options of dealing with the risks of global warming and climate change:
- Avoiding the risk involves eliminating the causes of the risk. I don’t believe we have a really good option for avoiding global warming/climate change at this point. If the changes are the result of natural climate processes, as some advocate, there is little we can do to avoid them. If they’re due to human influence, I think it’s impractical to expect an instantaneous change to less polluting energy sources. It’s unreasonable to expect every country in the world, or even the major polluters, to stop using fossil fuels immediately. It will take a decade or more to get a plan in place and to begin making all the changes. Politically, it just isn’t going to happen. Besides, we’re already seeing some of the predicted effects. I believe we’re just too far along to avoid at least some of the global warming/climate change consequences. (This is different from mitigating them, which is described later.)
- Transferring the risk is the next method of dealing with risk. To me this method is unacceptable because, if the consequences are even half as bad as predicted, there isn’t enough insurance money in the world to pay for them, not to mention the cost in lives lost to flooding and famine. The only thing we’d be doing is transferring the consequences to our children.
- Accepting the risk indicates the risk is acceded to because the cost of risk avoidance is unacceptable compared to the cost of the consequence. This category is usually used only for risks with low-impact consequences or for risks with damaging consequences but an extremely low probability of occurrence. Opponents of global warming will obviously favor this approach. It’s the one seen as having the least near-term impact on the economy because we continue on our path of utilizing fossil fuels.
- Finally, the fourth method: risk mitigation. This involves taking action to reduce the probability of occurrence of the risk or to reduce its impact. Risk mitigation, then, requires that we start addressing those consequences regardless of cause (natural or manmade). For example, we can begin planning our response to coastal flooding on a national scale. If there is a significant human-induced component to the changes in climate, then it may also not be too late to reduce the impact and perhaps even influence the degree to which it occurs (as opposed to completely avoiding it). We can accomplish this by reducing the emission of greenhouse gases through increased use of alternative energy sources. Replacing old industries with new is part of the creative destruction process that has occurred throughout human history. (See my LinkedIn Pulse post Horse Manure, Buggy Whips, and Global Warming.) In creative destruction, the displaced workers often find work in the new industry.
In my opinion, it comes down to which is worse:
- Accepting the impact of global warming/climate change happening while we aren’t prepared for its consequences, because we wanted to keep the status quo, or
- Waiting for 100% proof that global warming is real in order to protect the status quo and then finding out that it’s too late for many of the mitigations identified, or
- Beginning the mitigations identified to reduce the effects of global warming/climate change (and accepting the cost of near-term economic dislocations) and then finding out climate change is a false alarm. I guess your answer depends on whether you care more about yourself or your grandkids.
The process of creative destruction is often ignored in the debate about global warming, climate change, or whatever people decide to call it. Opponents focus on the costs of making changes as we convert to renewable energy and reduce our carbon footprint. They claim these technology changes will damage the economy, insisting the supposed high costs of renewable energy and sustainable manufacturing will cost the United States millions of jobs. To that I say balderdash.
The introduction of new technologies is accompanied by something called creative destruction, when leading companies, or even industries, apparently successful at the time of the introduction, disappear. For example, the arrival of the industrial revolution brought about an end to those magnificent artisans of the pre-industrial economies, e.g., the blacksmith, the shoemaker, the weaver, etc. Those old jobs disappeared, replaced by factory and white-collar positions. We moved from a rural society to an urban society.
Along with that change came a new set of issues. In 1900 there were 100,000 or more horses in New York City, creating thousands of pounds of manure that had to be removed. Hundreds if not thousands of workers toiled daily to clean up that mess. When it was introduced, the automobile was touted as a means of cleaning up the cities (among other things). I bet the workers who cleaned those city streets, along with the buggy whip makers, were among those who derided these newfangled toys, and probably shouted, “Get a horse!” With the introduction of the car came hundreds of companies trying to make them and capture the market. The manure workers and buggy whip makers probably also pointed to the failing early automobile companies as showing the folly of this technology. (Just like the opponents of action on global warming decry the failure of companies like Solyndra.) And true to form, most of these companies went out of business or were bought out. The car seemed to be a toy, a plaything of the rich, much as the Tesla electric car is today.
Then along came Henry Ford and the Model “T” automobile and everything changed. He made the Model “T” “everyman’s” car while paying the highest wages in the industry to enable his workers to afford to own their own car. Sure, at the time, it was probably still more expensive to purchase than a horse, but what you could do with it! Now the average worker could afford cars.
What do you think happened to those workers who cleaned the manure off city streets? They probably ended up with jobs paving them. And those who worked for the buggy whip makers? They found higher paying jobs in automobile factories. One man’s risk is another’s opportunity.
I’m reminded of that wonderful diatribe by Danny DeVito in the movie “Other People’s Money”, where he played a 1980s-style corporate raider, Larry the Liquidator, trying to take over a family-run wire-making manufacturing firm in New England. In his diatribe he talks about buggy whip makers. “You know, at one time there must’ve been dozens of companies makin’ buggy whips. And I’ll bet the last company around was the one that made the best goddamn buggy whip you ever saw.” Then the zinger: “Now how would you have liked to have been a stockholder in that company?”
Yes, there will be disruption as we switch to renewable energy and sustainable manufacturing. But in the long run, new industries will be created and the economy will grow based on those new industries. That’s just the way the world works. And, better yet, we may have saved the world for our children and grandchildren, but that’s a subject for another day.