Alex Daley's  Instablog

Alex Daley
Alex Daley is the senior editor of Casey's Extraordinary Technology. In his varied career, he's worked as a senior research executive, a software developer, project manager, senior IT executive, and technology marketer. He's a technologist who has collaborated on the development of...
My company: Casey Research
  • Amazon.com Creates 5,000 Jobs, Destroys 25,000 In The Process?
    The Technology Jobs Conundrum

    The past few weeks have seen the tech and business media abuzz about a not-so-little warehouse in Tennessee. That's because this distribution center, opening its doors with a burst of fanfare and even a few visits from nearby politicians, isn't a jumping-off point for Macy's or Target. Instead, the warehouse is the latest in a series of new locations being opened by retail technology giant Amazon.com.

    The jobs this new mega-warehouse is purported to create: 5,000.

    The politicians who made the quick trip to Tennessee to get in front of the cameras included one President Barack Obama, who gave a rousing speech not just about the importance of the 5,000 jobs promised at this one facility (and another 2,000 corollary positions at customer service centers elsewhere in the country), but also about the "need" to raise taxes on the businesses that are creating them and use those taxes to fund "infrastructure" jobs in green energy and natural gas, among other areas.

    The politically charged speech and the press conference from Amazon.com touting its job-creation abilities have understandably drawn quite the skeptical reaction from some. But surprisingly, in all the talk I've heard in the media these past few days, most of the fury seems directed not at the proposed tax hikes, but at Amazon itself for creating what might be fewer jobs than it claims… and possibly for destroying more jobs than it creates.

Along with its jobs announcement, Amazon's PR department was ready for the utterly predictable media criticism that the jobs are nothing more than menial, low-wage shop floor jobs. Its retort? These new jobs will pay on average 30% higher than typical retail jobs. That's no doubt a good thing for the employees: more pay for a job at a similar skill level.

But with its emphasis on low prices, how can Amazon afford to boost those wages so much? It all boils down to efficiency. Think about Amazon versus another retailing giant without the same level of sales, but with a similar "low prices" kind of push: TJX Companies (owner of TJ Maxx, Marshalls, and HomeGoods). Last year, Amazon had retail sales of over $60 billion globally (up from $47 billion in 2011); TJX brought in less than half that, at almost $26 billion globally (up from $23 billion in 2011).

    Whatever the companies' similarities, the differences between the two couldn't be bigger; and those differences have profound impacts for investors, as well as for the future of our economy.

    On one hand, you have companies like TJX-so-called "bricks and mortar" retailers-which in order to do business every day must staff a few thousand stores, and keep them open for 10, 12, or even 24 hours per day. That means greeters, checkers, security, customer service, and stockroom employees, plus bright lighting, catchy displays, and other ordinary features of a quality retail facility.

    Companies like Amazon have bricks and mortar too, of course, as displayed on the recent junket in Tennessee. But that's about all they have. Amazon can operate in facilities far off the beaten path, with nothing but wire shelves and cement floors, and they can serve just as many customers from only a fraction of the locations-and a fraction of the manpower-that their competitors require. On just about every front, the company is more efficient than its peers. But let's look specifically at two of the top expenses for virtually any retail company: its plant and its people.

    Facilities

    Last year, TJX ended with 3,055 stores, making it one of the most prolific clothing and home goods retailers in the US. It pegs its sales at a respectable $8.5 million annually per store. That's not a big-box retailer like Best Buy, which racked up $25 million per store in 2011, but it's solid compared to many other clothing and home goods retailers (like Sears, with much larger stores and only $9.5 million in per location sales, or the Gap, with smaller, mall-style stores and only $5 million per outlet).

    Amazon doesn't have stores, of course. It has "fulfillment centers," just like the new one in Tennessee that the POTUS visited. As of May 2013, a few dozen such centers are scattered across the world, including 37 in the US and 2 in Canada, and another dozen plus in Europe:

[Map: Amazon fulfillment center locations worldwide]

    Of course, it's unfair to count only fulfillment centers, as the company handles customer service at separate locations, something a retailer like TJX would normally handle mostly onsite in the stores (excepting the separate online sales division, but that constitutes only a small fraction of TJX's business). Add in those locations, plus the newest centers Amazon plans to open, and you still have fewer than 100 retail operations centers globally.

    If we round up just to be fair, that pegs Amazon's sales per location for the past year at $660 million, or about 80 times the sales per location of a TJX.
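If you want to check that math yourself, here's a minimal sketch in Python. The revenue figure is the trailing-twelve-month number from the comparison table further down; the 100-location count is the rounded-up estimate from the paragraph above:

```python
# Amazon's sales per location vs. TJX's per-store figure, as quoted above.
amazon_revenue = 66_848_000_000   # trailing-twelve-month revenue, USD
amazon_locations = 100            # fulfillment + customer service centers, rounded up
tjx_sales_per_store = 8_500_000   # TJX's own per-store sales figure, USD

amazon_per_location = amazon_revenue / amazon_locations
print(f"Amazon sales per location: ${amazon_per_location:,.0f}")  # ~$670 million
print(f"Multiple of TJX's per-store sales: "
      f"{amazon_per_location / tjx_sales_per_store:.0f}x")        # ~79x
```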

Beyond the sheer revenue per square foot that Amazon can push out its doors, the company can strike deals to grab land in otherwise desolate areas for its facilities, so long as UPS and FedEx will service them - no need for high-rent districts with lots of shopper traffic. This means that not only does the company have to support fewer square feet of space and make each square foot less glamorous, it can also push into districts willing to offer it big tax savings, or where construction costs are highly depressed (often one and the same). Amazon's negotiating power and economies of scale are both significantly greater than those of the average multi-thousand-storefront retailer.

    Employment

    Of course, having more sales per location means less if you have to fill that location with just as many people to get the goods out the door. Sure, you save on the overhead from electricity to property taxes, but you can only gain so much efficiency in a labor-intensive business. But here too, Amazon's differences are broadly apparent.

    TJX Companies Inc. has approximately 179,000 employees: as with any retailer, it's a mix of full-time, part-time, and seasonal workers.

    Amazon's demand is just as cyclical as any other retailer. When queried at the press conference in Tennessee, a company rep wouldn't specify how many of the 5,000 promised new jobs are part-time and seasonal. He instead jumped back to the pre-approved talking point about pay being 30% higher on average. The spokesman did acknowledge that a good number of them would not be full-time, year-round positions.

(Amazon is able to staff much more efficiently than most retailers during non-holiday seasons, thanks to real-time, around-the-clock operations management and the ability to pull staff in for extra shifts on short notice - a more difficult task for traditional storefront retailers, because having 10 extra people show up tomorrow from midnight to 6 a.m. doesn't help move any more product. So it's reasonable to assume Amazon will have more seasonal help as a percentage than most other retailers.)

    In all, though, Amazon has managed to grow to its current level with just about 89,000 employees, or almost exactly half of TJX's workforce. To do that yet pull down 150% more revenue makes Amazon about five times more efficient than TJX, measured by retail sales per staffer.

    Amazon is able to do so because the number of shoppers who can be served per warehouse employee is much higher than at a typical retail location-especially the smaller stores that TJX mostly uses. The same goes for all the servers the company operates: one IT guy can take the midnight shift and keep a few thousand servers up and running, but one janitor cannot clean the floors at two locations, let alone 400-and many other aspects of the business share this limitation.

    Stacking Up Against Retail's Big Boys

TJX wasn't cherry-picked as a counterexample - there are many retailers with far worse operating margins than TJX to pick on if that were the goal. Amazon stands head and shoulders above nearly all other retailers in efficiency, even when stacked up against larger competitors that bring more scale to the business than TJX can, such as retail megastore Walmart, whose $470 billion in sales doesn't put a dent in Amazon's lead in revenue per employee:

Company (Ticker)                # of Employees   Revenue TTM (000s)   Rev per Employee
Wal-Mart (NYSE:WMT)                  2,200,000         $470,339,000        $213,790.45
CVS Caremark (NYSE:CVS)                241,500         $123,098,000        $509,722.57
Target (NYSE:TGT)                      361,000          $73,140,000        $202,603.88
Amazon.com (Nasdaq GS:AMZN)             88,400          $66,848,000        $756,199.10
TJX Companies (NYSE:TJX)               179,000          $26,269,900        $146,759.22

As is obvious above, Amazon's revenue per employee is leaps and bounds above its competition. The only other one on our short list that comes close is CVS, but despite the public perception of the company as a retailer, it now generates the majority of its revenues from its pharmacy benefits management business, so it's partially miscategorized. Even so, it still doesn't come close to Amazon in terms of revenue efficiency; CVS would need to grow its sales by nearly 50% without adding a single staffer in order to catch up.

    Walmart? It would have to lay off over two-thirds of its employees or triple its same-store sales to even come close.
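For the spreadsheet-averse, here's a quick Python sketch that reproduces the table's ratios and those two catch-up scenarios (all figures from the table above):

```python
# Revenue per employee for the five retailers in the table above,
# plus the CVS and Walmart catch-up scenarios described in the text.
companies = {
    "Wal-Mart":      (470_339_000_000, 2_200_000),  # (revenue TTM in USD, employees)
    "CVS Caremark":  (123_098_000_000,   241_500),
    "Target":         (73_140_000_000,   361_000),
    "Amazon.com":     (66_848_000_000,    88_400),
    "TJX Companies":  (26_269_900_000,   179_000),
}
rpe = {name: rev / emp for name, (rev, emp) in companies.items()}
for name, value in sorted(rpe.items(), key=lambda kv: -kv[1]):
    print(f"{name:<14} ${value:>12,.2f} per employee")

# CVS would need ~48% more revenue with zero new hires to match Amazon:
print(f"CVS revenue growth needed: {rpe['Amazon.com'] / rpe['CVS Caremark'] - 1:.0%}")
# Walmart would need to shed ~72% of its staff (or roughly triple its sales):
print(f"Walmart staff cut needed:  {1 - rpe['Wal-Mart'] / rpe['Amazon.com']:.0%}")
```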

    Amazon's revenue efficiency is remarkable by virtually any standard. And it's one of the reasons that despite a long track record of underperforming on earnings, the company has generally stayed in the market's good graces and been granted a hefty valuation relative to its peers. The market sees Amazon for what it is-a technology company with leverage that can grow revenues in a big way-and not a more traditional retailer.

    But this also means that for every jar of Crazy Aaron's Thinking Putty that is shipped out the door, the number of people on the payroll is only one-fourth that of Walmart or Target… or one-fifth that of TJX.

    What Does This Mean for the President's Jobs Plan?

    Does this mean that Barack Obama is right? Does the government need to step in, tax these efficient companies harder, and spend that money on solar farms, wind turbines, and transport infrastructure? After all, you cannot efficiently deliver tens of millions of packages when roads are choked and bridges are crumbling.

    At first glance, it would be easy to jump to that conclusion. That's the direction I heard quite a few of Tuesday and Wednesday's commentators heading with their questions, even on business-oriented stations like my morning commute companion, Bloomberg Radio.

    And yes, in some senses, for every job Amazon "creates," four other jobs go away at a company like TJX.

But it's not that simple. First, with any efficiency comes inefficiency on another front. The relative competitiveness of two business models is driven by the economic weight of those trade-offs against each other - an equation that changes over time. For instance, Clayton Christensen eloquently pointed out years ago in his book The Innovator's Dilemma (available at Amazon.com, of course) that the hydraulic backhoes and diggers that litter every construction site in the modern world were once considered vastly inferior to their cable-actuated big brothers. They couldn't - and still can't - dig nearly as wide a trench per pass. But it didn't matter. They won the overall market on convenient smaller sizes, easier maintenance, and better safety. Their inefficiencies were simply smaller than the value of their improvements.

Much the same, while you can argue about Amazon's efficiency relative to TJX, you must be careful not to discount its INefficiency. For instance, at a store like TJ Maxx, one truckload of goods can restock the place for weeks. To deliver the orders of a few dozen Amazon customers might require dozens of individual packages, shipped through half a dozen different sort facilities and put on a few dozen different trucks.

    TJX benefits from all the dinosaur dung burned by its customers coming to the store and driving goods back home for it. Amazon, on the other hand, takes those nearly countless millions of extra miles and absorbs them into its business model. Compared in this way, Amazon is incredibly inefficient.

    What Amazon doesn't spend hiring direct employees, the company outsources to others, and the sum of those expenses is enormous.

In fact, the company is so good at spending that gross margin that it usually manages to spend it all. Just look at this graph of Amazon's gross revenue and its profits for the past five years…

[Chart: Amazon gross revenue vs. net profit, past five years]

    You might have to squint to find the red bars that represent profits. The difference between those two sets of colored bars is the amount Amazon spends on not just its 89,000 employees, but all of its operations and marketing. All of those dollars flow into the economy, be it through FedEx drivers and pilots, outsourced programmers, content licensing agreements, purchases of inventory to keep its unimaginable selection of goods growing, or any of hundreds of other paths. And each path is itself a series of jobs generated and a series of taxes that eventually do flow to the government to fund all of its activities-from building fences to keep out Mexicans to staffing the IRS to collect all that money.

Even if the company were consistently profitable, the same would be true. The people who made more money would be inclined to invest and spend it. The higher the relative profitability, the more free cash flow - and the more likely it is to turn into additional spending.

    Any company whose technology improves efficiency in these ways is going to reduce employment in the affected industry. That's the breaks. But it does not necessarily mean those jobs are gone forever.

A time will come when the amount of efficiency gained by automation, aggregation, logistics, and other techniques puts another serious dent in the demand for labor - we've already seen it with low-skill manufacturing jobs, and we are seeing it take hold now in retail sales. But that does not have to mean the end of the economy or an economy of no jobs. Instead, the efficiency flows outward to create new opportunities surrounding the new business and its investors.

    Revenue efficiency creates demand for business services. What is not done in house gets pushed to other networks of suppliers-services contracted by the sellers or by the buyers. Also, profit efficiency creates income for investors, which gets reinvested into more new businesses. Just look to the Silicon Valley venture-capital scene, and you'll see years of investment gains flowing back into the markets to the tune of multiple billions of dollars per year in investments into companies bent on reinventing every area of the economy, from entertainment to building supplies to biotechnology to car rental, through technology.

    The biggest risk to our economy and employment is not innovation, which speeds up the flow of capital. Rather, the risks to be on the lookout for are the ones that compromise the ability of businesses to innovate, or which otherwise impede the flow of capital-i.e., those that restrict lending, cause people or companies to hoard cash, or generate fear and distrust among trading partners.

Fears over banking failures in 2008 showed just how serious this risk can be. And while the short-term reaction of shoring up liquidity in financial markets may look to have been a smart one in retrospect, the long-term lack of consequences for outright criminal activity in that sector is damning for the future of the economy. Trust in our government, bankers, and currency continues to sit at all-time lows, a factor that can only be slowing the return of economic growth.

    If Barack Obama and camp want to generate more government revenues, they should consider backing off on the new taxes and focus instead on limiting the regulations that impede the flow of capital. They should help secure confidence in the economy, not just through buoying up the stock markets, but through justice and prudent oversight over the value of our money and security of our savings.

Companies like Amazon thankfully continue to invest and grow despite the rising tide of operational regulations (for all the deregulation in banking over the past 20 years, the same cannot be said of any other sector) and the continued political immunity of the banking class. As they do so, their innovations are making life better for the majority of people. While the industry-wide shift of employment will be painful for some, over time the economy will adjust, as it always has, and other employment opportunities will fill the gaps created.

    Plus, it's not like Amazon has a lock on retail efficiency. After all, Costco-the star quarterback of the big-box store movement-generates over $975,000 in sales per employee. Yes, it seems like even Amazon could still learn a few things.

    As for the question of whether more efficient companies-which employ software and even robots (as Amazon does in its warehouses increasingly every year) to save on labor-destroy more jobs than they create… it's not as simple as it might seem at first. Innovation ultimately benefits society, so we must be careful not to hamper it for fear of the unknown future it will bring.

    While technology jobs pose a conundrum for politicians, companies like Amazon will continue to grow their market share and make handsome profits for investors. If you aren't making your share of them, consider a risk-free trial subscription to BIG TECH today, and see the actionable and profitable investment advice our technology team provides for yourself. Learn more and get started today.

    Disclosure: I have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.

Aug 02, 3:31 PM
  • The Coming Water Wars

    Water is not scarce. It is made up of the first and third most common elements in the universe, and the two readily react to form a highly stable compound that maintains its integrity even at temperature extremes.

    Hydrologist Dr. Vincent Kotwicki, in his paper Water in the Universe, writes:

    "Water appears to be one of the most abundant molecules in the Universe. It dominates the environment of the Earth and is a main constituent of numerous planets, moons and comets. On a far greater scale, it possibly contributes to the so-called 'missing mass' [i.e., dark matter] of the Universe and may initiate the birth of stars inside the giant molecular clouds."

    Oxygen has been found in the newly discovered "cooling flows" - heavy rains of gas that appear to be falling into galaxies from the space once thought empty surrounding them, giving rise to yet more water.

    How much is out there? No one can even take a guess, since no one knows the composition of the dark matter that makes up as much as 90% of the mass of the universe. If comets, which are mostly ice, are a large constituent of dark matter, then, as Dr. Kotwicki writes, "the remote uncharted (albeit mostly frozen) oceans are truly unimaginably big."

    Back home, Earth is often referred to as the "water planet," and it certainly looks that way from space. H2O covers about 70% of the surface of the globe. It makes all life as we know it possible.

    The Blue Planet?

    However it got here - theories abound from outgassing of volcanic eruptions to deposits by passing comets and ancient crossed orbits - water is what gives our planet its lovely, unique blue tint, and there appears to be quite a lot of it.

    That old axiom that the earth is 75% water... not quite. In reality, water constitutes only 0.07% of the earth by mass, or 0.4% by volume.

    This is how much we have, depicted graphically:

Credit: Howard Perlman, USGS; globe illustration by Jack Cook, Woods Hole Oceanographic Institution (©); Adam Nieman.

    What this shows is the relative size of our water supply if it were all gathered together into a ball and superimposed on the globe.

    The large blob, centered over the western US, is all water (oceans, icecaps, glaciers, lakes, rivers, groundwater, and water in the atmosphere). It's a sphere about 860 miles in diameter, or roughly the distance from Salt Lake City to Topeka. The smaller sphere, over Kentucky, is the fresh water in the ground and in lakes, rivers, and swamps.

    Now examine the image closely. See that last, tiny dot over Georgia? It's the fresh water in lakes and rivers.

Looked at another way, that ball of all the water in the world represents a total volume of about 332.5 million cubic miles. But of this, 321 million mi³, or 96.5%, is saline - great for fish, but undrinkable without the help of nature or some serious hardware. That still leaves a good bit of fresh water, some 11.6 million mi³, to play with. Unfortunately, the bulk of that is locked up in icecaps, glaciers, and permanent snow, or is too far underground to be accessible with today's technology. (The numbers come from the USGS; obviously, they are estimates and they change a bit every year, but they are accurate enough for our purposes.)

Accessible groundwater amounts to 5.614 million mi³, with 55% of that saline, leaving a little over 2.5 million mi³ of fresh groundwater. That translates to about 2.7 exa-gallons of fresh water, or about 2.7 billion billion gallons (yes, billions of billions, or 2.7 × 10^18 in scientific notation), which is about a third of a billion gallons of water per person. Enough to take a long shower every day for many lifetimes...
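Those conversions are easy enough to check. A minimal sketch, assuming one cubic mile is about 1.101 trillion US gallons and a world population of roughly 7 billion:

```python
# Fresh groundwater: cubic miles -> gallons -> gallons per person.
GALLONS_PER_CUBIC_MILE = 1.101e12  # 1 mi^3 = ~4.168e9 m^3 * ~264.17 gal/m^3
fresh_groundwater_mi3 = 2.5e6      # USGS estimate quoted above
world_population = 7e9             # assumed, circa 2013

total_gallons = fresh_groundwater_mi3 * GALLONS_PER_CUBIC_MILE
per_person = total_gallons / world_population
print(f"Fresh groundwater: {total_gallons:.2e} gallons")  # ~2.8e18, i.e. ~2.7 exa-gallons
print(f"Per person:        {per_person:.2e} gallons")     # ~3.9e8, a third of a billion or so
```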

However, not all of that groundwater is easily or cheaply accessible. The truth is that the surface is the source for the vast majority - nearly 80% - of our water. Of surface waters, lakes hold 42,320 mi³, only a bit over half of which is fresh, and the world's rivers hold only 509 mi³ of fresh water, less than 2/10,000 of 1% of the planetary total.

    And that's where the problem lies. In 2005 in the US alone, we humans used about 328 billion gallons of surface water per day, compared to about 83 billion gallons per day of water from the ground. Most of that surface water, by far, comes from rivers. Among these, one of the most important is the mighty Colorado.

    Horseshoe Bend, in Page, AZ. (AP Photo)

    Tapping Ol' Man River

    Or perhaps we should say "the river formerly known as the mighty Colorado." That old Colorado - the one celebrated in centuries of American Western song and folklore; the one that exposed two billion years of geologic history in the awesome Grand Canyon - is gone. In its place is… well, Las Vegas - the world's gaudiest monument to hubristic human overreach, and a big neon sign advertising the predicament now faced by much of the world.

    It's well to remember that most of the US west of the Mississippi ranges from relatively dry to very arid, to desert, to lifeless near-moonscapes. The number of people that could be supported by the land, especially in the Southwest, was always small and concentrated along the riverbanks. Tribal clusters died out with some regularity. And that's the way it would have remained, except for a bit of ingenuity that suddenly loosed two powerful forces on the area: electrical power, and an abundance of water that seemed as limitless as the sky.

    In September of 1935, President Roosevelt dedicated the pinnacle of engineering technology up to that point: Hoover Dam. The dam did two things. It served as a massive hydroelectric generating plant, and it backed up the Colorado River behind it, creating Lake Mead, the largest reservoir in the country.

    Early visitors dubbed Hoover Dam the "Eighth Wonder of the World," and it's easy to see why. It was built on a scale unlike anything before it. It's 725 feet high and contains 6 million tons of concrete, which would pave a road from New York to Los Angeles. Its 19 generators produce 2,080 MW of electricity, enough to power 1.75 million average homes.

    The artificially created Lake Mead is 112 miles long, with a maximum depth of 590 feet. It has a surface area of 250 square miles and an active capacity of 16 million acre-feet.

Hoover Dam was intended to generate sufficient power and impound enough water to meet any conceivable need. But as things turned out, grand as the dam is, it wasn't conceived grandly enough... because it is 35 miles from Las Vegas, Nevada.

    Vegas had a permanent population in 1935 of 8,400, a number that swelled to 25,000 during the dam construction as workers raced in to take jobs that were scarce in the early Depression years. Those workers, primarily single men, needed something to do with their spare time, so the Nevada state legislature legalized gambling in 1931. Modern Vegas was born.

    The rise of Vegas is well chronicled, from a middle-of-nowhere town to the largest city founded in the 20th century and the fastest-growing in the nation - up until the 2008 housing bust. Somehow, those 8,400 souls turned into a present population of over 2 million that exists all but entirely to service the 40 million tourists who visit annually. And all this is happening in a desert that sees an average of 10 days of measurable rainfall per year, totaling about 4 inches.

In order to run all those lights, fountains, and revolving stages, Las Vegas requires 5,600 MW of electricity on a summer day. Did you notice that that's more than 2.5 times what the giant Hoover Dam can put out? Not to mention that those 40 million annual visitors (plus 2 million locals) need a lot of water to drink to stay properly hydrated in the 100+ degree heat. And it all comes from Lake Mead.
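The mismatch is easy to make explicit - a back-of-the-envelope sketch using only the figures above:

```python
# Las Vegas summer demand vs. Hoover Dam capacity, per the figures above.
hoover_capacity_mw = 2_080        # Hoover Dam's full generating capacity
vegas_summer_demand_mw = 5_600    # Las Vegas on a summer day
homes_powered = 1_750_000         # average homes Hoover can supply

print(f"Vegas demand vs. Hoover output: "
      f"{vegas_summer_demand_mw / hoover_capacity_mw:.1f}x")       # ~2.7x
print(f"Implied average draw per home:  "
      f"{hoover_capacity_mw * 1_000 / homes_powered:.2f} kW")      # ~1.19 kW
```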

    So what do you think is happening to the lake?

    If your guess was, "it's shrinking," you're right. The combination of recent drought years in the West and rapidly escalating demand has been a dire double-whammy, reducing the lake to 40% full. Normally, the elevation of Lake Mead is 1,219 feet. Today, it's at 1,086 feet and dropping by ten feet a year (and accelerating). That's how much more water is being taken out than is being replenished.

    This is science at its simplest. If your extraction of a renewable resource exceeds its ability to recharge itself, it will disappear - end of story. In the case of Lake Mead, that means going dry, an eventuality to which hydrologists assign a 50% probability in the next twelve years. That's by 2025.

Nevadans are not unaware of this. There is at the moment a frantic push to get approval for a massive pipeline project designed to bring in water from the more favored northern part of the state. Yet even if the pipeline were completed in time - and there is stiff opposition to it (and you thought only oil pipelines gave rise to politics and protests) - that would only resolve one issue. There's another. A big one.

    Way before people run out of drinking water, something else happens: When Lake Mead falls below 1,050 feet, the Hoover Dam's turbines shut down - less than four years from now, if the current trend holds - and in Vegas the lights start going out.
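That "less than four years" figure falls straight out of the elevations quoted above:

```python
# Years until Lake Mead hits the turbine cutoff, at the current rate of decline.
current_elevation_ft = 1_086  # today's lake level
turbine_cutoff_ft = 1_050     # below this, Hoover's turbines shut down
drop_per_year_ft = 10         # current decline rate (and accelerating)

years_left = (current_elevation_ft - turbine_cutoff_ft) / drop_per_year_ft
print(f"Years to turbine shutdown at current rate: {years_left:.1f}")  # 3.6
```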

    What Doesn't Stay in Vegas

    Ominously, these water woes are not confined to Las Vegas. Under contracts signed by President Obama in December 2011, Nevada gets only 23.37% of the electricity generated by the Hoover Dam. The other top recipients: Metropolitan Water District of Southern California (28.53%); state of Arizona (18.95%); city of Los Angeles (15.42%); and Southern California Edison (5.54%).

    You can always build more power plants, but you can't build more rivers, and the mighty Colorado carries the lifeblood of the Southwest. It services the water needs of an area the size of France, in which live 40 million people. In its natural state, the river poured 15.7 million acre-feet of water into the Gulf of California each year. Today, twelve years of drought have reduced the flow to about 12 million acre-feet, and human demand siphons off every bit of it; at its mouth, the riverbed is nothing but dust.

    Nor is the decline in the water supply important only to the citizens of Las Vegas, Phoenix, and Los Angeles. It's critical to the whole country. The Colorado is the sole source of water for southeastern California's Imperial Valley, which has been made into one of the most productive agricultural areas in the US despite receiving an average of three inches of rain per year.

The Valley is fed by an intricate system consisting of 1,400 miles of canals and 1,100 miles of pipeline. They are the only reason a bone-dry desert can look like this:

[Photo: irrigated farmland in the Imperial Valley]

    Intense conflicts over water will probably not be confined to the developing world. So far, Arizona, California, Nevada, New Mexico, and Colorado have been able to make and keep agreements defining who gets how much of the Colorado River's water. But if populations continue to grow while the snowcap recedes, it's likely that the first shots will be fired before long, in US courtrooms. If legal remedies fail… a war between Phoenix and LA might seem far-fetched, but at the minimum some serious upheaval will eventually ensue unless an alternative is found quickly.

    A Litany of Crises

    Water scarcity is, of course, not just a domestic issue. It is far more critical in other parts of the world than in the US. It will decide the fate of people and of nations.

    Worldwide, we are using potable water way faster than it can be replaced. Just a few examples:

    • The legendary Jordan River is flowing at only 2% of its historic rate.
• In Africa, desertification is proceeding at an alarming rate. Much of the northern part of the continent is already desert, of course. But beyond that, a US Department of Agriculture study places about 2.5 million km² of African land at low risk of desertification, 3.6 million km² at moderate risk, 4.6 million km² at high risk, and 2.9 million km² at very high risk. "The region that has the highest propensity," the report says, "is located along the desert margins and occupies about 5% of the land mass. It is estimated that about 22 million people (2.9% of the total population) live in this area."
    • A 2009 study published in the American Meteorological Society's Journal of Climate analyzed 925 major rivers from 1948 to 2004 and found an overall decline in total discharge. The reduction in inflow to the Pacific Ocean alone was about equal to shutting off the Mississippi River. The list of rivers that serve large human populations and experienced a significant decline in flow includes the Amazon, Congo, Chang Jiang (Yangtze), Mekong, Ganges, Irrawaddy, Amur, Mackenzie, Xijiang, Columbia, and Niger.

    Supply is not the only issue. There's also potability. Right now, 40% of the global population has little to no access to clean water, and despite somewhat tepid modernization efforts, that figure is actually expected to jump to 50% by 2025. When there's no clean water, people will drink dirty water - water contaminated with human and animal waste. And that breeds illness. It's estimated that fully half of the world's hospital beds today are occupied by people with water-borne diseases.

    Food production is also a major contributor to water pollution. To take two examples:

    • The "green revolution" has proven to have an almost magical ability to provide food for an ever-increasing global population, but at a cost. Industrial cultivation is extremely water intensive, with 80% of most US states' water usage going to agriculture - and in some, it's as high as 90%. In addition, factory farming uses copious amounts of fertilizer, herbicides, and pesticides, creating serious problems for the water supply because of toxic runoff.
    • Modern livestock facilities - known as concentrated animal feeding operations (CAFOs) - create enormous quantities of animal waste that is pumped into holding ponds. From there, some of it inevitably seeps into the groundwater, and the rest eventually has to be dumped somewhere. Safe disposal practices are often not followed, and regulatory oversight is lax. As a result, adjacent communities' drinking water can come to contain dangerously high levels of E. coli bacteria and other harmful organisms.

    Not long ago, scientists discovered a whole new category of pollutants that no one had previously thought to test for: drugs. We are a nation of pill poppers and needle freaks, and the drugs we introduce into our bodies are only partially absorbed. The remainder is excreted and finds its way into the water supply. Samples recently taken from Lake Mead revealed detectable levels of birth control medication, steroids, and narcotics... which people and wildlife are drinking.

Most lethal of all are the industrial pollutants that continue to find their way into the water supply. The carcinogenic effects of these compounds have been well documented - most famously hexavalent chromium, the contaminant made notorious by the movie-famed Erin Brockovich case.

    But the problem didn't go away with Brockovich's court victory. The sad fact is that little has changed for the better. In the US, our feeble attempt to deal with these threats was the passage in 1980 of the so-called Superfund Act. That law gave the federal government - and specifically the Environmental Protection Agency (EPA) - the authority to respond to chemical emergencies and to clean up uncontrolled or abandoned hazardous-waste sites on both private and public lands. And it supposedly provided money to do so.

How's that worked out? According to the Government Accountability Office (GAO), "After decades of spearheading restoration efforts in areas such as the Great Lakes and the Chesapeake Bay, improvements in these water bodies remain elusive … EPA continues to face the challenges posed by an aging wastewater infrastructure that results in billions of gallons of untreated sewage entering our nation's water bodies … Lack of rapid water-testing methods and development of current water quality standards continue to be issues that EPA needs to address."

Translation: the EPA hasn't produced. How much of this is due to the typical drag of a government bureaucracy and how much to lack of funding is debatable, as is whether there might be a better way to attack the problem. But what is not debatable is the magnitude of the problem stacking up, mostly unaddressed.

    Just consider that the EPA has a backlog of 1,305 highly toxic Superfund cleanup sites on its to-do list, in every state in the union (except apparently North Dakota, in case you want to try to escape - though the proliferation of hydraulic fracking in that area may quickly change the map, according to some of its detractors - it's a hotly debated assertion).

    About 11 million people in the US, including 3-4 million children, live within one mile of a federal Superfund site. The health of all of them is at immediate risk, as is that of those living directly downstream.

    We could go on about this for page after page. The situation is depressing, no question. And even more so is the fact that there's little we can do about it. There is no technological quick fix.

    Peak oil we can handle. We find new sources, we develop alternatives, and/or prices rise. It's all but certain that by the time we actually run out of oil, we'll already have shifted to something else.

But "peak water" is a different story. There are no new sources; what we have is what we have. Absent a profound climate change that turns the evaporation/rainfall hydrologic cycle much more to our advantage, there likely isn't going to be enough to go around.

    As the biosphere continually adds more billions of humans (the UN projects there will be another 3.5 billion people on the planet, a greater than 50% increase, by 2050 before a natural plateau really starts to dampen growth), the demand for clean water has the potential to far outstrip dwindling supplies. If that comes to pass, the result will be catastrophic. People around the world are already suffering and dying en masse from lack of access to something drinkable... and the problems look poised to get worse long before they get better.

    Searching for a Way Out

    With a problem of this magnitude, there is no such thing as a comprehensive solution. Instead, it will have to be addressed by chipping away at the problem in a number of ways, which the world is starting to do.

    With much water not located near population centers, transportation will have to be a major part of the solution. With oil, a complex system of pipelines, tankers, and trucking fleets has been erected, because it's been profitable to do so. The commodity has a high intrinsic value. Water doesn't - or at least hasn't in most of the modern era's developed economies - and thus delivery has been left almost entirely to gravity. Further, the construction of pipelines for water that doesn't flow naturally means taking a vital resource from someone and giving it to someone else, a highly charged political and social issue that's been known to lead to protest and even violence. But until we've piped all the snow down from Alaska to California, transportation will be high on the list of potential near term solutions, especially to individual supply crunches, just as it has been with energy.

    Conservation measures may help too, at least in the developed world, though the typical lawn-watering restrictions will hardly make a dent. Real conservation will have to come from curtailing industrial uses like farming and fracking.

But these stopgap solutions can only forestall the inevitable unless other advances address the underlying problems. Thankfully, where there is a challenge, there are always technology innovators to help address it. It was wells and aqueducts that let civilization move inland from the riverbank, irrigation that made communal farming scale, and sewers and pipes that turned villages into cities, after all. And just as then, entrepreneurs today are developing some promising technologies.

Given how much water we use today, there's little doubt that conservation's sibling, recycling, is going to be big. Microfiltration systems are very sophisticated and can produce recycled water that is near-distilled in quality. Large-scale production remains a challenge, as does the reluctance of people to drink something that was reclaimed from human waste or industrial runoff. But that might just require the right spokesperson. California believes so, in any case, as it forges ahead with its Porcelain Springs initiative. A company called APTwater has taken on the important task of purifying contaminated leachate water from landfills that would otherwise pollute the groundwater. This simply uses energy and technology to accelerate the natural process of replenishment, but if it can be done at scale, we will eventually reach the point where trading oil or coal for clean drinking water makes economic sense. It's already starting to in many places.

    Inventor Dean Kamen of Segway fame has created the Slingshot, a water-purification machine that could be a lifesaver for small villages in more remote areas. The size of a dorm-room refrigerator, it can produce 250 gallons of water a day, using the same amount of energy it takes to run a hair dryer, provided by an engine that can burn just about anything (it's been run on cow dung). The Slingshot is designed to be maintenance-free for at least five years.

    Kamen says you can "stick the intake hose into anything wet - arsenic-laden water, salt water, the latrine, the holding tanks of a chemical waste treatment plant; really, anything wet - and the outflow is one hundred percent pure pharmaceutical-grade injectable water."

    That naturally presupposes there is something wet to tap into. But Coca-Cola, for one, is a believer. This September, Coke entered into a partnership with Kamen's company, Deka Research, to distribute Slingshots in Africa and Latin America.

    Ceramic filters are another, low-tech option for rural areas. Though clean water output is very modest, they're better than nothing. The ability to decontaminate stormwater runoff would be a boon for cities, and AbTech Industries is producing a product to do just that.

    In really arid areas, the only water present may be what's held in the air. Is it possible to tap that source? "Yes," say a couple of cutting-edge tech startups. Eole Water proposes to extract atmospheric moisture using a wind turbine. Another company, NBD Nano, has come up with a self-filling water bottle that mimics the Namib Desert beetle. Whether the technology is scalable to any significant degree remains to be seen.

    And finally, what about seawater? There's an abundance of that. If you ask a random sampling of folks in the street what we're going to do about water shortages on a larger scale, most of them will answer, "desalination." No problem. Well, yes problem.

Desalination (sometimes shortened to "desal") plants are already widespread, and their output is ramping up rapidly. According to the International Desalination Association, in 2009 there were 14,451 desalination plants operating worldwide, producing about 60 million cubic meters of water per day. That figure rose to 68 million m³/day in 2010 and is expected to double to 120 million m³/day by 2020. That sounds impressive, but the stark reality is that it amounts to only around a quarter of one percent of global water consumption.

    Boiling seawater and collecting the condensate has been practiced by sailors for nearly two millennia. The same basic principle is employed today, although it has been refined into a procedure called "multistage flash distillation," in which the boiling is done at less than atmospheric pressure, thereby saving energy. This process accounts for 85% of all desalination worldwide. The remainder comes from "reverse osmosis," which uses semipermeable membranes and pressure to separate salts from water.

    The primary drawbacks to desal are that a plant obviously has to be located near the sea, and that it is an expensive, highly energy-intensive process. That's why you find so many desal facilities where energy is cheap, in the oil-rich, water-poor nations of the Middle East. Making it work in California will be much more difficult without drastically raising the price of water. And Nevada? Out of luck. Improvements in the technology are bringing costs of production down, but the need for energy, and lots of it, isn't going away. By way of illustration, suppose the US would like to satisfy half of its water needs through desalination. All other factors aside, meeting that goal would require the construction of more than 100 new electric power plants, each dedicated solely to that purpose, and each with a gigawatt of capacity.
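A rough sanity check of that 100-plant figure, assuming US water withdrawals of about 410 billion gallons per day (the USGS's 2005 estimate) and a reverse-osmosis energy cost of roughly 3.5 kWh per cubic meter - both assumptions of ours, not figures from the paragraph above:

```python
# How much continuous generating capacity would it take to desalinate
# half of US water demand? (Back-of-the-envelope; assumptions noted above.)
US_WITHDRAWALS_GAL_PER_DAY = 410e9  # assumed: USGS 2005 total withdrawals
GALLONS_PER_M3 = 264.172
DESAL_KWH_PER_M3 = 3.5              # assumed: typical seawater reverse osmosis

half_demand_m3_per_day = (US_WITHDRAWALS_GAL_PER_DAY / 2) / GALLONS_PER_M3
energy_kwh_per_day = half_demand_m3_per_day * DESAL_KWH_PER_M3
average_power_gw = energy_kwh_per_day / 24 / 1e6  # kWh/day -> continuous GW

print(f"Continuous power needed: ~{average_power_gw:.0f} GW")  # ~113 GW, i.e. 100+ 1-GW plants
```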

    Moving desalinated water from the ocean inland adds to the expense. The farther you have to transport it and the greater the elevation change, the less feasible it becomes. That makes desalination impractical for much of the world. Nevertheless, the biggest population centers tend to be clustered along coastlines, and demand is likely to drive water prices higher over time, making desal more cost-competitive. So it's a cinch that the procedure will play a steadily increasing role in supplying the world's coastal cities with water.

    In other related developments, a small tech startup called NanOasis is working on a desalination process that employs carbon nanotubes. An innovative new project in Australia is demonstrating that food can be grown in the most arid of areas, with low energy input, using solar-desalinated seawater. It holds the promise of being very scalable at moderate cost.

    The Future

    This article barely scratches the surface of a very broad topic that has profound implications for the whole of humanity going forward. The World Bank's Ismail Serageldin puts it succinctly: "The wars of the 21st century will be fought over water."

    There's no doubt that this is a looming crisis we cannot avoid. Everyone has an interest in water. How quickly we respond to the challenges ahead is going to be a matter, literally, of life and death. Where we have choices at all, we had better make some good ones.

    From an investment perspective, there are few ways at present to acquire shares in the companies that are doing research and development in the field. But you can expect that to change as technologies from some of these startups begin to hit the market, and as the economics of water begin to shift in response to the changing global landscape.

    We'll be keeping an eye out for the investment opportunities that are sure to be on the way.

While profit opportunities in companies working to solve the world's water woes may not be imminent, there are plenty of ways to leverage technology for outsized gains right now. One of the best involves a technology so revolutionary, its impact could rival that of the printing press.

    Disclosure: I have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.

Feb 25, 9:35 PM
  • Breaking Down A Biotech Winner

Traditional cancer treatment options are little more than a crude mix of "slash, burn, and poison" - that is, surgery, radiation, and chemotherapy. There are radical new treatments in labs and trials all over the world that promise to throw out this trifecta; no other disease has received more of the research interest and funding that have defined modern biotechnology over the past three decades.

    I'm not going to tell you about any of those here. Sure, many of them will be wildly successful and make many investors fabulously wealthy over the next few decades. But most will fail. And those that don't will take a long time to turn a profit for investors.

Yet, there is one small company whose unique twist on cancer treatment is proving to be a major upgrade. We profiled this company in a recent edition of Casey Extraordinary Technology, and it turned in a gain of over 167% for subscribers in just six months' time. It may yet make billions more for investors.

    You see, in recent years chemotherapy has become the core treatment for most cancerous malignancies. And while these toxic cocktails of chemicals have proven effective at destroying cancerous cells, they also have one problem. A big one.

    Chemo, being essentially a poison, doesn't just attack cancerous cells - it attacks a broad range of healthy cells too. As a result, the treatment can sometimes be as harmful as the cancer itself in the short run. The side effects are awful, and its use can quickly erode patients' health. Some have even described chemo as a "cure that's worse than the disease."

    This sad state of affairs for the world's second most-prevalent chronic disease is why the cancer-research arena has been exploding over the past few years with the goal of developing more targeted, less-toxic therapies - in other words, to do a better job killing cancer cells while leaving healthy cells alone.

    That's exactly what Lawrenceville, New Jersey-based Celsion Corp. (NASDAQ:CLSN) has the technology to do. And chances are the company is on to one of the biggest cancer-treatment breakthroughs in decades.

    How It Works

Our story starts with liposomes. These nanosized artificial vesicles are made from the same materials as our cell membranes: natural phospholipids - versions of the chemicals that make up everything from fat to earwax - and cholesterol.

Not long after their discovery in the 1960s, scientists began experimenting with liposomes as a means of encapsulating drugs, especially cancer drugs. Why? Something called the "enhanced permeability and retention" (EPR) effect. This is a property of molecules of certain sizes - for example, liposomes, nanoparticles, and macromolecular drugs - which tend to accumulate in tumor tissue much more than they do in normal tissues. It's a useful feature for a cancer drug.

Thus, they offer a potential way to combat the two biggest drawbacks of traditional chemotherapeutics: systemic toxicity and low bioavailability at the tumor site. In other words, the drugs now employed are themselves toxic to normal cells, and they tend to get largely used up before they even reach the tumor site.

    Early attempts to encapsulate drugs inside liposomes did an okay job of dealing with the toxicity issue, but bioavailability at the tumor site was still limited. Our immune system saw to that. Just like virtually anything else artificial we put into our bodies, traditional liposomes were seen as invaders. Thus, they were rapidly cleared by the mononuclear phagocyte system, the part of the immune system centered around the spleen (yes, we do use it) that destroys viruses, fungi, and other foreign invaders.

However, a breakthrough arrived when scientists came up with a new way to sneak these artificial compounds into the body undetected by our defenses. The process gave us what are called "PEGylated" liposomes, with a covalent attachment of polyethylene glycol polymer chains. The effect of attaching these little plastic chains to the end of the liposome was to create a "stealth" liposome-encapsulated drug that was hardly noticed by the immune system.

Problem solved, right? Well, not exactly. A lot of hard work went into getting drugs into liposomes to reduce toxicity, then a bunch more into stopping our immune system from kicking in. But there was yet another problem. The drug-release rates of these stealth liposomes were generally so low that tumor cells barely got a dose. Scientists had made them so stealthy that they even skated right by cancer cells, usually failing to kill off the tumors.

    After decades of experimenting with liposome-encapsulated cancer drugs, scientists still had not been able to safely deliver therapeutic concentrations of the chemotherapy drugs to all tumor cells.

    They had to devise a way to induce drug release when and where it would be more effective.

    The next big idea came in more recent years, as scientists devised temperature-sensitive liposomes. Heat them and they pop, releasing the drugs just when you need them to. From stealth to non-stealth in a matter of seconds, and right on target.

They made it work - but unfortunately only at temperatures that essentially cooked patients from the inside, sort of defeating the purpose of keeping the chemo at bay to reduce collateral damage. Early designs failed to perform at tolerable levels of heat or exposure time: fifteen minutes of baking at temperatures up to 112° Fahrenheit still released only 40% or so of the drug. That might not sound very hot, but it was enough to be intensely painful and damaging as well.

That's where Celsion came in. It has designed and developed a novel form of these temperature-sensitive chemo sacks - the first of their kind to work effectively and safely - otherwise known as a lysolipid thermally sensitive liposome (LTSL).

    Celsion's liposomes are engineered to release their contents between 39-42° C, or 102.2-107.6° F (thus, another translation of LTSL has become "low-temperature sensitive liposome"). And they release the contents at an extremely fast rate, to boot.

    A Better Way to Use Chemo

    These unique properties of Celsion's LTSL technology make it vastly superior to previous liposome technology for a number of reasons.

    • For starters, the temperature range is much more tolerable to patients and won't injure normal tissue.
• Second, the temperature range takes advantage of the natural effect mild hyperthermia has on tumor vasculature. Numerous studies have shown that temperatures between 39-43° C increase blood flow and vascular permeability (or leakiness) of a tumor, which is ideal for drug delivery since the cancer-killing chemicals have easy access to all areas of the tumor. These effects are not seen at temperatures below this threshold, and temperatures above it tend to cause hemorrhage, which can reduce or halt blood flow, hampering drug delivery. It's the Goldilocks Effect: the in-between range is perfect.
    • Third, Celsion's LTSL technology promotes an accelerated release of the drug when and where it will be most effective. That allows for direct targeting of organ-specific tumors.

    Celsion's LTSL technology has shown that it's capable of delivering drugs to the tumor site at concentrations up to 30 times greater than those achievable with chemotherapeutics alone, and three to five times greater than those of more traditional liposome-encapsulated drug-delivery systems.

    The company's first drug under development is ThermoDox, which uses its breakthrough LTSL technology to encapsulate doxorubicin, a widely used chemotherapeutic agent that is already approved to treat a wide range of cancers.

Currently, ThermoDox is undergoing a pivotal Phase III global clinical trial - denoted the "HEAT study" - for the treatment of primary liver cancer (hepatocellular carcinoma, or HCC), in combination with radiofrequency ablation (RFA).

    RFA uses high-frequency radio waves to generate a high temperature that is applied with a probe placed directly in the tumor, which by itself kills tumor cells in the immediate vicinity of the probe. Cells on the outer margins of larger tumors may survive, however, because temperatures in the surrounding area are not high enough to destroy them. But the temperatures are high enough to activate Celsion's LTSL technology. Thus, the heat from the radio-frequency device thermally activates the liposomes in ThermoDox in and around the periphery of the tumor, releasing the encapsulated doxorubicin to kill remaining viable cancer cells throughout the region, all the way to the tumor margin.

    ThermoDox is also undergoing a Phase I/II clinical trial for the treatment of recurrent chest wall (RCW) breast cancer (known as the "DIGNITY study"), and a Phase II clinical trial for the treatment of colorectal liver metastases (the "ABLATE study"). But most of the drug's (and hence the company's) value is tied up in the HEAT study.

The HEAT trial is a pivotal 700-patient global Phase III study being conducted at 79 clinical sites under a special protocol assessment (SPA) agreement with the FDA. The FDA has designated the HEAT study as a fast-track development program, which provides for expedited regulatory review; and it has granted orphan-drug status to ThermoDox for the treatment of HCC, providing seven years of market exclusivity following FDA approval. Furthermore, other major regulatory agencies, including the European Medicines Agency (EMA) and China's equivalent, have all agreed to use the results of the HEAT study as an acceptable basis to approve ThermoDox.

The primary endpoint for the HEAT study is progression-free survival - living longer with no cancer growth. There's a secondary confirmatory endpoint of overall survival, too. Both the oncological and investing communities are eagerly awaiting the results, which are due any day now.

    So then, why are we on the sidelines now, right when the big news is due to hit? That all goes back to why Celsion was such a good investment to begin with, and what it can tell us about finding other big wins in the technology stock market.

    A Winner in the Making

    When we're looking for a strong pick in the biotechnology, pharmaceuticals, and medical devices fields - once we have established the quality of the technology itself and ensured it will likely work as expected - there is a simple set of tests we apply to ensure that we've found a stock that can deliver significant, near-term upside. The most critical of these are:

• The technology must provide a distinct competitive advantage over the current standard of care and be superior to any competitors' effort that will come to market before or shortly after our subject's does. In other words, it must improve outcomes by improving patients' length or quality of life (e.g., a cure for a disease, or a maintenance medication with fewer side effects), or lower costs while maintaining quality of care (e.g., a generic drug). A therapy that does both is all the better.
    • The market must be measurable and addressable. There must be some way to say specifically how many patients would benefit from a therapy, and to ensure that those patients are under the care of providers who would make efficient distribution of the therapy possible. For instance, a successful treatment for Parkinson's disease might be applicable to hundreds of thousands of patients, with little competition from other treatments, whereas a treatment for von Hippel-Lindau disease (VHL) might only reach hundreds. If the goal is to recover years of research investment and profit above and beyond that, then market size matters, as do current and future competitors that might limit your reach within a treatment area.
    • Payers should be easily convinced to cover the new therapy at profitable rates. In the modern world of health care, failure of a treatment to garner coverage from government medical programs like Medicare and the UK's National Health Service, and from private insurance companies (which generally cooperate closely to decide how to classify a treatment and whether to cover it), is usually a game-ender. Payers have a responsibility not just to patients but to their shareholders or taxpayers to stay financially solvent. This means that if a therapy does not provide a compelling cost/benefit ratio, it won't be covered. For instance, if you release a new painkiller that is only as effective as Tylenol and costs $1,000 per dose, you're obviously not going to see support.
    • There must be a clear path to market in the short term, or another catalyst to propel the stock upward. An investment in a great technology does not always make for a great investment. You have to consider the quality of the management team and the structure of the company, including its ability to pay the bills and get to market without defaulting or diluting you out of your position. And of course, time. The biggest and most frequent mistake investors make in technology is assuming that it is smooth and short sailing from concept to market. Reality is much harsher than that, and in biotechnology and pharmaceuticals in particular - with a tough regulatory gamut to run - the timeline to take a new technology to market can run anywhere from a decade to thirty, forty, or even fifty years.

    Liposomes are a perfect example of that. Twenty years ago, I probably could have told you a story about a technology that was very similar to what was laid out above. It would be compelling and enticing to investors of all stripes - a breakthrough technology with the promise to revolutionize cancer care by making chemo less toxic and more effective at the same time. Yet had you invested in that promise alone, chances are you'd be completely wiped out by now, or maybe - just maybe - still waiting for a return.

    That is why we invest in proof, not promises. So, how does Celsion stack up against our four main proof points?

    Time to market: When we first recommended Celsion, it was in Phase III pivotal trials. This is the last major stage of human testing usually required before a company can submit an FDA New Drug Application and apply to market the product.

    The process of bringing a drug to market, even once a specific compound has been identified and proven to work in vitro (in the lab), is perilous. Many things can go wrong along the way. If you look at investing in a company whose drugs are just entering Phase I clinical trials, for instance, it is still unclear whether the therapy is effective in vivo (in the human body). This is a critical stumbling block for many companies, whose promising compounds immediately prove less effective or more dangerous than lab testing suggested. Even if Phase I goes well, it can take a decade and sometimes longer to get from there to market with a drug. And even Phase II trials often leave treatments five or more years from market - though there are exceptions in cases where a therapy proves very effective or a disease has few other treatment options. But shortcuts are rare, and investors have to consider the time and expense (which lead to fundraising and ultimately dilute your return) of getting from A to Z.

    In this regard, Celsion made a uniquely great investment. When we first recommended the company, it was in the midst of a pivotal Phase III trial and looked to be about a year or so away from its first commercialization. (Though, speaking to the length of these trials, this one had been started back in 2008.)

    With many of the most high-profile companies in the industry - those working on vogue treatment areas and conditions, like hepatitis C treatments of late - the large banks bid up the stocks to high levels as they get this close to market, content to squeeze just a few percentage points out at the end. They have to be conservative, since they're investing large amounts of other people's money. However, biotechnology is such a fragmented space, with far more companies than Wall Street can possibly cover in depth, that coming across a gem like Celsion late in the game with a potentially big win is not as uncommon as you'd think. The "efficient market" hypothesis fails to account for the fact that no one can know everything, including every stock. And Celsion had gone all but unnoticed for some time.

    Payer acceptability: Celsion has the benefit of developing a 2.0-style product, an improvement over something that already exists. RFA is already in relatively widespread use and has proven effective enough that most every insurance and benefits provider will cover it. Even the early generations of LTSL, while not quite as safe or effective as desired, were enough of a benefit to gather pretty solid support from payers.

    Celsion, through its clinical trial process, has proven its unique blend is safer, better tolerated by patients, and much more effective than its predecessors. Thus, payer support at a reasonable price is a pretty sure bet.

    Market size: When we originally recommended Celsion, we stated that the company was sitting on a multibillion-dollar opportunity, and we stand by that statement. However, just because something is eventually worth that amount does not mean it's bankable today as a short-term investment. So we try to keep our analysis narrowly focused on what can be directly counted and measured. In Celsion's case, that's the Phase III treatment, ThermoDox, and the one area in which it is being studied: primary liver cancer (HCC). Even in this narrow band, however, we see the market opportunity for Celsion as in excess of $1 billion.

    HCC is one of the deadliest forms of cancer. It currently ranks as the fifth most common solid-tumor cancer, and it's quickly moving up. With the fastest rate of growth among all cancer types, HCC is projected to be the most prevalent form of cancer by 2020. The incidence of primary liver cancer is nearly 30,000 cases per year in the US and approximately 40,000 cases per year in Europe. But the situation worldwide is far worse, with approximately 750,000 new cases of HCC per year, due to the high prevalence of hepatitis B and C in developing countries.

    When the disease is caught early, the standard first-line treatment for primary liver cancer is surgical resection of the tumor. Early-stage liver cancer generally has few symptoms, however, so by the time the disease is detected, the tumor is usually too large for surgery. Thus, at least 80% of patients are ineligible for surgery or transplantation by the time they are diagnosed. And there are few nonsurgical therapeutic options available, as radiation and chemotherapy are largely ineffective.

    RFA has emerged as the standard of care for non-resectable liver tumors, but it has limitations. The treatment becomes less effective for larger tumors, as local recurrence rates after RFA directly correlate to the size of the tumor. (As noted earlier, RFA often fails at the margins.) ThermoDox promises the ability to reduce the recurrence rate in HCC patients when used in combination with RFA. If it proves itself in Phase III, there's no doubt the drug will be broadly adopted throughout the world once it is approved.

    A quick look at the numbers: According to the most recent data from the National Cancer Institute, the incidence rates of HCC per 100,000 people in the three major markets are 4 in the US, 5 in Europe, and approximately 27 in China. Based on these incidence rates, the total addressable patient population in these three regions (which we will conservatively assume to be the total addressable worldwide population for the time being) is approximately 400,000 per year (12,000 in the US, 40,000 in Europe, and 351,000 in China).

    Assuming that 50% of HCC patients are eligible for nonsurgical invasive therapy such as RFA, approximately 200,000 patients worldwide would be eligible for ThermoDox. Further assuming an annual cost of treatment for ThermoDox of $20,000 in the US, $15,000 in Europe, and $5,000 in China, in line with similar treatments of the same variety, we estimate that the market potential of ThermoDox could be up to $1.3 billion. Not to mention the countless thousands of lives saved. (And that's before the rest of the developing world comes online.)
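    To make that arithmetic concrete, here is a minimal sketch in Python. The regional population figures are our own rough assumptions, chosen only to reproduce the case counts cited above; the incidence rates, eligibility fraction, and treatment prices come straight from the estimates in the text.

```python
# Rough sketch of the ThermoDox addressable-market estimate.
# Populations (in millions) are assumptions, not from the article.
POPULATION_M = {"US": 315, "Europe": 800, "China": 1300}
INCIDENCE_PER_100K = {"US": 4, "Europe": 5, "China": 27}   # NCI incidence rates cited above
PRICE_PER_COURSE = {"US": 20_000, "Europe": 15_000, "China": 5_000}  # assumed treatment cost (USD)
ELIGIBLE_FRACTION = 0.50  # share of HCC patients assumed eligible for RFA-style therapy

total_revenue = 0.0
for region, pop_m in POPULATION_M.items():
    cases = pop_m * 1_000_000 * INCIDENCE_PER_100K[region] / 100_000  # new cases per year
    eligible = cases * ELIGIBLE_FRACTION                              # RFA-eligible patients
    revenue = eligible * PRICE_PER_COURSE[region]                     # annual revenue potential
    total_revenue += revenue
    print(f"{region}: {cases:,.0f} cases/yr, {eligible:,.0f} eligible, ${revenue / 1e6:,.0f}M")

print(f"Total market potential: ${total_revenue / 1e9:.2f}B")  # ~$1.3B
```

    Run as written, the sketch lands on roughly $1.3 billion, matching the figure above; the small differences in regional case counts (e.g., 12,600 vs. 12,000 in the US) come from the assumed population sizes.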

    Of course, this is an estimate of ThermoDox's potential assuming 100% market penetration - something that simply never happens. While we expect ThermoDox in combination with RFA to become the standard of care for primary liver cancer, a more reasonable expectation for maximum market penetration after a six-year ramp-up to peak sales (from an expected approval in 2013) is probably 40%.

    Improving outcomes or lowering costs: This is exactly what the Phase III trial was intended to prove: efficacy beyond a shadow of a doubt. Given preliminary data and earlier trial results, it was already a pretty sure thing, so in our model, we assumed about a 70% chance of success (to be on the conservative side, as always - it's better to be right by a mile than to miss by an inch).

    Once we incorporate that probability of success into our model, we come to a probability-weighted peak sales figure in 2019 of approximately $365 million annually.

    The average price-to-sales ratio among the big players in biotech these days is about 5. If we apply a sales multiple of 3 (i.e., just 60% of that average) to Celsion's probability-weighted peak sales for ThermoDox in 2019, we come up with a value for the company of nearly $1.1 billion. Assuming the company issues no new stock between now and then, that equates to about $33 per share - more than 17 times where the stock was trading when we recommended a buy.
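    Spelling the valuation math out, here is a minimal sketch under the same assumptions. The share count is a hypothetical figure backed out of the $1.1 billion valuation and the $33 target; it is not taken from a filing.

```python
# Probability-weighted valuation sketch for ThermoDox in HCC.
MARKET_POTENTIAL = 1.3e9     # full-penetration revenue, from the estimate above
PEAK_PENETRATION = 0.40      # assumed maximum market share after a six-year ramp
P_SUCCESS = 0.70             # assumed probability the Phase III trial succeeds
SALES_MULTIPLE = 3           # 60% of the ~5x big-biotech price-to-sales average
SHARES_OUT = 33_000_000      # hypothetical share count, assuming no dilution

peak_sales = MARKET_POTENTIAL * PEAK_PENETRATION * P_SUCCESS
valuation = peak_sales * SALES_MULTIPLE
per_share = valuation / SHARES_OUT

print(f"Probability-weighted peak sales: ${peak_sales / 1e6:.0f}M")  # ~$364M
print(f"Implied company value: ${valuation / 1e9:.2f}B")             # ~$1.09B
print(f"Implied price per share: ${per_share:.2f}")                  # ~$33
```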

    And remember, these numbers are only for ThermoDox under the HCC indication.

    Our Move to the Sidelines

    With final data from the pivotal Phase III trial expected within the next few weeks, Celsion's stock has ballooned in recent weeks from the $2 range to $7.50 or so. Now, that's a far cry from the $33 price we mentioned above, but remember, that's a target for 2019. And it doesn't allow for a whole range of things that could go wrong.

    Chief among those concerns is that the Phase III data come in worse than expected. Even a small variance in efficacy or a simple question about safety could knock a few hundred million dollars off those sales figures. Or it could push trials back a year or two, delaying returns and sending short-term-minded investors - like those who have recently bid up CLSN shares - retreating to the hills for the time being.

    Further downfield, there is sure to be competition as well, and of course we may get those miraculous chemo-free treatments mentioned up front.

    In short, we don't have a crystal ball and can't tell you what the world will look like in 2019. If you believe yours is clear, ask yourself if you thought touchscreen phones and tablets would outsell traditional computers by 3 to 1 globally in 2012. If not, you might want to give the crystal a polish.

    To be clear, the value of Celsion in the near term hinges on a binary event - the results of the ongoing HEAT trial. We are of the opinion that CLSN represents one of the best opportunities we've come across since we started this letter, and that the probability of a successful trial is high. Nevertheless, there is substantial downside if the trial is unsuccessful. And on news of a delay stemming from any concerns raised, the stock could take years to recover, if it ever does.

    We'd already advised subscribers to take a free ride early on in our coverage of the stock, taking all of the original investment risk off the table. Even with that protection, however, the short-term potential is still more heavily weighted to the downside. Thus, we booked our profits and stepped to the sidelines on this one.

    Celsion continues to be a model, even at today's prices, for a great biotech investment with significant upside potential. But we're content to wait for the market to hand us another, similar opportunity.

    The pages of Casey Extraordinary Technology are filled with investments just like Celsion - up-and-coming technology companies the market has yet to discover. With 2012 coming to a close, the service's track record for the year is a remarkable 9 winners out of 9 closed positions, with an average gain of 61%. Get in on it now: subscribe today and save 25% off the regular price - as always, backed by our unconditional money-back guarantee.

    Disclosure: I have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.
