Wall & Main

My perspectives as an investor and consumer

A glimpse into the future VI

This week’s look into the world of cutting-edge research:

  1. Herbal extract inhibits the development of pancreatic cancer. An herb appears to inhibit development of pancreatic cancer as a result of its anti-inflammatory properties, according to researchers from the Kimmel Cancer Center at Jefferson who presented their work at the 100th meeting of the American Association for Cancer Research.  Thymoquinone, the major constituent of the oil extract from a Middle Eastern herbal seed called Nigella sativa, exhibited anti-inflammatory properties that reduced the release of inflammatory mediators in pancreatic cancer cells, according to Hwyda Arafat, M.D., Ph.D.  “Not only could patients with chronic pancreatitis benefit from this, but so could several other groups at risk of development or recurrence of pancreatic cancer, such as high-risk family members and post-surgical patients. These potent effects show promise for the herb as a potential preventive and therapeutic strategy for pancreatic cancer,” Dr. Arafat said.
  2. Reversible generation of high capacity hydrogen storage material demonstrated. Researchers at the U.S. Department of Energy’s Savannah River National Laboratory have created a reversible route to generate aluminum hydride (alane), a high capacity hydrogen storage material. This achievement is not only expected to accelerate the development of a whole class of storage materials, but also has far reaching applications in areas spanning energy technology and synthetic chemistry.  For years, one of the major obstacles to the realization of the hydrogen economy has been hydrogen storage. Solid-state storage, using solid materials such as metals that absorb hydrogen and release it as needed, has many safety and practicality advantages over storing hydrogen as a liquid or gas.  Alane possesses the desired qualities but until now, had been considered impractical because of the high pressures required to combine hydrogen and aluminum to reform the hydride material.
  3. ‘Designer’ immune cells created with the help of microscopic ‘beads’ ignore transplanted organs. The future of organ transplantation could include microscopic beads that create “designer” immune cells to help patients tolerate their new organ, Medical College of Georgia researchers say. The degradable microparticles deliver the most powerful known form of HLA-G, a natural suppressor of the immune response, straight to dendritic cells, which typically show the immune system what to attack. The microparticles are given right after a transplant, just as dendritic cells are giving the immune system a heads up to get busy attacking the new organ. Unlike current anti-rejection drugs that generally suppress the immune system – leaving patients vulnerable to infections, cancer and more – HLA-G offers specific “tolerance.” Marked microparticles also have treatment potential in diseases where the immune system attacks normal tissue, such as arthritis, multiple sclerosis and inflammatory bowel disease.
  4. New approach to scrambling light may lead to sharper images, wider views. When photographers zoom in on an object to see it better, they lose the wide-angle perspective — they are forced to trade off “big picture” context for detail. But now an imaging method developed by Princeton researchers could lead to lenses that show all parts of the scene at once in the same high detail.  The new method could help build more powerful microscopes and other optical devices.  This method addresses the shortcomings of small apertures by taking advantage of the unusual properties of substances called nonlinear optical materials. In conventional lens materials such as glass or plastic, rays of light pass through without interacting with one another. In nonlinear materials, light rays mix with each other in complex ways.  The image from a nonlinear lens would therefore be rich in detail once the information is unscrambled, which is what the researchers have been able to accomplish.
  5. Physical reality of string theory demonstrated. String theory has come under fire in recent years for making promises it has not lived up to. Leiden theoretical physicists have now, for the first time, used string theory to describe a physical phenomenon. Their discovery has been reported in Science Express.  Electrons can form a special kind of state, a so-called quantum critical state, that plays a role in high-temperature superconductivity.  Superconductivity once seemed possible only at very low temperatures close to absolute zero, but more and more examples are emerging where it also occurs at higher temperatures, and no one had managed to explain this phenomenon.  The quantum-critical state occurs in a material just before it becomes superconductive at high temperature; in this state, the electrons exhibit the same behavior at the small quantum-mechanical scale as at the macroscopic human scale.  The Leiden physicists have used an aspect of string theory to shed light on this phenomenon.

Filed under: Health, Science, Technology

What does energy independence mean?

Does the implementation of energy independence and environmental stewardship need to be linked?  Can each aspect be tackled separately or in some order of priority?

Energy independence, considered on its own, would mean that the sources of energy that feed our consumption come from within the United States.  Environmental stewardship would involve lower emission of pollutants and greenhouse gases during the production and consumption of each additional Btu of energy.

We currently seem to be tackling two massive issues at the same time with aspects of each pulling in opposite directions and hindering the advancement of either.   It seems to me that we would be able to step closer to accomplishing both if we tackle energy independence first.  When we control the input-output elements of our nation’s energy, we can better direct our efforts towards environmental stewardship.

The schematic below provides a summary of energy sources, and energy consumption, in the United States in 2007 as compiled from Energy Information Administration data.  In order to compare the various sources, I’ve converted the normal units of each into Btu.  The colors representing the various sources are consistent within the consumption sectors.  This will, hopefully, facilitate quick comprehension of the dynamics between the energy sources and consumption sectors.

The United States consumed 101.6 quadrillion Btu of energy in 2007.  85% of the energy we consumed (in Btu) was provided by petroleum (39%), natural gas (23%), and coal (22%).  64% of our petroleum consumption and 16% of natural gas consumption came from imports.  The United States is a net exporter of coal.

Weaning ourselves off imported petroleum and natural gas – energy independence – while reducing pollutants and greenhouse gases – environmental stewardship – would require replacing petroleum and natural gas with renewable energy.  Herein lies the challenge of tackling these two major issues at the same time.

Since 96% of our transportation needs are met by petroleum-derived products, the replacement fuel from renewable energy sources would have to serve as transportation fuel.  This practically eliminates solar, wind, geothermal, and hydro-electric power, which together made up 48% of 2007 renewable energy production.  Only 17.4% of the 3.6 quadrillion Btu of biomass directly contributed to transportation through fuel ethanol and biodiesel.  There would, therefore, have to be a 41-fold increase in the production of fuel ethanol and biodiesel to replace current petroleum imports, not to mention the complications associated with setting up infrastructure for the replacement fuel.
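The arithmetic behind that 41-fold figure can be checked directly from the numbers quoted in this post. The short script below is a back-of-the-envelope sketch using only those figures:

```python
# Back-of-the-envelope check of the scale-up needed for biofuels to
# replace petroleum imports, using the 2007 figures quoted in this post.
# All energy values are in quadrillion Btu ("quads").
total_consumption = 101.6        # total US energy consumption, 2007
petroleum_share = 0.39           # petroleum's share of consumption
petroleum_import_share = 0.64    # fraction of petroleum that was imported
biomass = 3.6                    # biomass production, quads
transport_biofuel_share = 0.174  # share of biomass going to ethanol/biodiesel

petroleum_imports = total_consumption * petroleum_share * petroleum_import_share
biofuels = biomass * transport_biofuel_share

print(f"Petroleum imports: {petroleum_imports:.1f} quads")
print(f"Fuel ethanol and biodiesel: {biofuels:.2f} quads")
print(f"Required scale-up: {petroleum_imports / biofuels:.1f}x")
```

With these rounded inputs the ratio comes out to about 40.5, consistent with the roughly 41-fold increase cited above.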

I don’t pretend to be an expert in energy infrastructure, nor do I claim to have the perfect solution.  However, one viable alternative I see would be to focus renewable energy efforts on the nation’s electric grid.  As can be seen in the schematic, the electric power sector is one area to which all sources contribute, especially natural gas and renewable energy.  More importantly, almost all sources of renewable energy can be used to generate electricity.

Natural gas, according to various sources, is an abundant resource within our country and cleaner burning than petroleum or coal.  Measuring natural gas in the ground is no easy job, and it involves a great deal of inference and estimation. With new technologies, these estimates are becoming more and more reliable; however, they are still subject to revision.  Three different organizations – Energy Information Administration, National Petroleum Council, and Potential Gas Committee – estimate that the technically recoverable natural gas resources are about 1.5 quadrillion cubic feet (1,500 trillion cubic feet).  In comparison, we produced 19.3 trillion cubic feet of natural gas in 2007.  In other words, our technically recoverable resources of natural gas are roughly 78 times our total production in 2007 – nearly eight decades of supply at that rate.  Technically recoverable resources include both discovered and undiscovered resources that we currently have the technical capability to extract.  Keep in mind that just because a resource is technically recoverable doesn’t mean that it is economically recoverable, where the parties involved have a financial incentive to extract it.  Update: On June 18, 2009, the Potential Gas Committee updated its resource numbers for natural gas to 1.84 quadrillion cubic feet.
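Keeping the units straight matters here: the resource estimates are quoted in quadrillion (thousand trillion) cubic feet, while annual production is reported in trillion cubic feet. A quick sketch of the comparison, using the figures above:

```python
# Compare the technically recoverable natural gas resource base against
# annual production. Units are trillion cubic feet (Tcf).
technically_recoverable = 1500.0  # ~1.5 quadrillion cubic feet
production_2007 = 19.3            # US natural gas production in 2007, Tcf

years_of_supply = technically_recoverable / production_2007
print(f"Resource base is about {years_of_supply:.0f} years of 2007-level production")
```

By the same arithmetic, the Potential Gas Committee’s June 2009 update of 1.84 quadrillion cubic feet works out to roughly 95 years of 2007-level production.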

If clean coal technology is more of a story than a reality, we could focus our efforts on unconventional natural gas – the most abundant form in the United States – and renewable energy (solar, wind, geothermal, hydro-electric, and biomass) such that they become the major contributors to each additional Btu added to the electric power sector.

Transportation, on the other hand, would seem to require one source of power the way gasoline has served consumers and diesel has served commercial vehicles in the past.  The infrastructure that is needed to support fuel consumption – filling stations, fuel transport – is the main reason for such a requirement.  The ability to go to any gas station and fill one’s vehicles with the same kind of fuel made the adoption of cars and light trucks easier for consumers.  I believe the same will be required for the next source of power for consumer vehicles.  Electricity is one such option.

Consumer-based transportation (light trucks and smaller vehicles) can gradually be shifted to run on electricity through improved battery technology and recharge stations.  Roughly 8.8 million barrels of gasoline and 3.4 million barrels of diesel were consumed per day for transportation purposes last year.  Since most of the gasoline is used in cars and light trucks, replacing it with electric power should be sufficient to eliminate petroleum imports, as US production of petroleum should be adequate to meet the needs of heavy-duty trucks and industrial applications.
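To get a feel for the size of the load the grid would inherit, here is a rough conversion of that gasoline consumption into electric demand. The 124,000 Btu-per-gallon heat content is a standard approximation, and the threefold tank-to-wheels efficiency advantage of electric drivetrains is my own rough assumption, not a measured figure:

```python
# Rough conversion of US gasoline consumption into equivalent electric
# demand. The consumption figure is taken as barrels per day; the
# drivetrain efficiency factor is an illustrative assumption.
GASOLINE_BBL_PER_DAY = 8.8e6
GALLONS_PER_BARREL = 42
BTU_PER_GALLON = 124_000          # approximate heat content of gasoline
BTU_PER_KWH = 3412

annual_btu = GASOLINE_BBL_PER_DAY * GALLONS_PER_BARREL * BTU_PER_GALLON * 365
print(f"Gasoline energy: {annual_btu / 1e15:.1f} quadrillion Btu per year")

# Assume an electric drivetrain delivers the same miles on roughly a
# third of the input energy.
ev_kwh_per_year = annual_btu / 3 / BTU_PER_KWH
print(f"Equivalent EV demand: ~{ev_kwh_per_year / 1e12:.1f} trillion kWh per year")
```

On these assumptions the figure lands in the trillions of kilowatt-hours per year, a substantial fraction of total US electric generation, which makes the grid-capacity concerns discussed below unsurprising.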

Shai Agassi is one of the leaders in the area of electric vehicle services, such as recharging stations, and is currently implementing his vision in Israel.

Experts estimate that the capacity of the electric grid will have to double in order to facilitate a major shift towards electric vehicles, which is why I believe it important to focus all current natural gas and renewable energy efforts to increasing electric capacity.

An added benefit is the ability to change the mix of sources for electric power generation down the road as we continue to make strides in the efficiency and scale of renewable energy.  For example, we may be able to wean ourselves off coal and nuclear, followed eventually by natural gas.  No matter what the input mix is and how long it takes us to get there, consumers can rest assured that the output will remain the same – electricity, which will continue to power their homes and vehicles into the future.

Filed under: Energy, Government, Politics

A glimpse into the future V

1.  A new target may help maintain healthy blood pressure. In trying to understand the role of prostaglandins – a family of fatty compounds key to the cardiovascular system – in blood pressure maintenance, researchers at the University of Pennsylvania School of Medicine discovered that mice that lack the receptor for one type of prostaglandin – PG F2-alpha – have lower blood pressure and less atherosclerosis than their non-mutant brethren. The normal role for PG F2-alpha is to increase blood pressure and accelerate atherosclerosis, at least in rodents.  Targeting this pathway could represent a novel therapeutic approach to cardiovascular disease.  Results were published in the Proceedings of the National Academy of Sciences.  The delicate balance the body maintains to keep blood pressure stable involves not only the prostaglandin system, but another biological pathway, the renin-angiotensin-aldosterone system, or RAAS.  Mice lacking the PG F2-alpha receptor also showed reduction in levels of renin, angiotensin I, and aldosterone, a biological situation leading to lower blood pressure.

2.  A smartphone can now serve as an ultrasound imager. William D. Richard and David Zar, researchers at Washington University in St. Louis, are bringing the minimalist approach to medical care and computing by coupling USB-based ultrasound probe technology with a smartphone, enabling a compact, mobile computational platform and a medical imaging device that fits in the palm of a hand.  It is now possible to build smartphone-compatible USB ultrasound probes for imaging the kidney, liver, bladder, and eyes, endocavity probes for prostate and uterine screenings and biopsies, and vascular probes for imaging veins and arteries for starting IVs and central lines.  The vision of the new system is to train people in remote areas of the developing world on the basics of gathering data with the phones and sending it to a centralized unit many miles, or half a world, away where specialists can analyze the image and make a diagnosis.  A typical, portable ultrasound device may cost as much as $30,000. Some of these USB-based probes sell for less than $2,000 with the goal of a price tag as low as $500.

3.  Blood cells can be reprogrammed to act as embryonic stem cells. In a recent study, U.S. researchers have reprogrammed cells found in circulating blood into cells that are molecularly and functionally indistinguishable from embryonic stem cells, providing a readily accessible source of stem cells and an alternative to harvesting embryonic stem cells. Embryonic stem cells have long been coveted for their potential to treat a multitude of diseases as a result of their unique properties of self-renewal and pluripotency (the ability to develop into any type of cell in the body), but their use has been the subject of political controversy.  To generate induced pluripotent stem cells (dubbed iPS cells), scientists isolated CD34+ cells – a type of stem cell that produces only blood cells – from blood samples.  The CD34+ cells were infected with viruses carrying reprogramming factors that can reset the blood cells to an embryonic state. The colonies of cells exhibited physical characteristics similar to embryonic stem (ES) cells and expressed the same markers as ES cells.

4.  A super-fast 167-processor chip is ultra energy-efficient. A new, extremely energy-efficient chip, containing an array of 167 processors, that provides breakthrough speeds for a variety of computing tasks has been designed by a group at the University of California, Davis. The chip, dubbed AsAP, is ultra-small, fully reprogrammable and highly configurable, so it can be widely adapted to a number of applications. The chip is designed for digital signal processing. While not the principal kind of processor chip used in desktop computers, digital signal processing chips are found in a myriad of everyday and specialized devices such as cell phones, MP3 music players, video equipment, anti-lock brakes and ultrasound and MRI medical imaging machines.  Twelve chips working together could perform more than half-a-trillion operations per second (0.52 tera-ops/sec) while using less power than a 7-watt light bulb – up to 10 times the speed of currently available chips, while reducing power consumption by a factor of up to 75.  Details of the chip design have been published in the IEEE Journal of Solid-State Circuits.

5.  The future of infrastructure could be in self-healing concrete. A concrete material developed at the University of Michigan can heal itself when it cracks. No human intervention is necessary, just water and carbon dioxide.  Self-healing is possible because the material is designed to bend and crack in narrow hairlines rather than break and split in wide gaps, as traditional concrete behaves.  Self-healed specimens recovered most, if not all, of their original strength after researchers subjected them to a 3 percent tensile strain.  It’s the equivalent of stretching a 100-foot piece an extra three feet – enough strain to severely deform metal or catastrophically fracture traditional concrete.  Traditional concrete will fracture and cannot carry a load at 0.01 percent tensile strain.  Today, builders reinforce concrete structures with steel bars to keep cracks as small as possible. But they’re not small enough to heal, so water and deicing salts can penetrate to the steel, causing corrosion that further weakens the structure.  The self-healing concrete needs no steel reinforcement to keep crack width tight, so it eliminates corrosion.

Filed under: Health, Science, Technology

Scientifically exploring solutions to health care

As America considers major health care reforms, can its crafting of policy benefit from scientific guidance?  Ryan Moore, co-author of a study published in The Lancet, a leading international medical journal, thinks so.  Seguro Popular is Mexico’s ambitious plan to improve healthcare for its estimated 50 million uninsured citizens.  The publication was the culmination of a collaborative study of Seguro Popular between Mexican health officials and researchers from leading American universities.  As a result of this study, U.S. policymakers are encouraged to scientifically explore solutions to America’s own looming healthcare crisis – an experimental approach with the potential for providing objective answers to even the most controversial and politically charged questions.

“If the administration has done arms-length science and has involved third parties, like the researchers who were involved in this study, then the case that the administration can make for continuing these programs is much stronger,” said Moore, an assistant professor of political science at Washington University in St. Louis. “They’re more likely to get at the truth – it’s good politics and it’s good science.”

The article, “Public Policy for the Poor? A Randomized Assessment of the Mexican Universal Health Insurance Program,” details a massive, two-year field experiment designed to evaluate Mexico’s push to bring better healthcare to communities ranging from remote villages to crowded urban areas. The study turned dozens of Mexican communities into real-world laboratories where causal effects of the insurance program could be empirically measured and evaluated at the household level as new services rolled out in phases across seven Mexican states: Guerrero, Jalisco, Estado de Mexico, Morelos, Oaxaca, San Luis Potosi and Sonora.

Moore and colleagues developed the experimental design, wrote public-use software to implement it and then “tied their own hands” by publishing a preliminary study detailing exactly how the experiment and analysis would be carried out – a process designed to insulate findings from after-the-fact political meddling.

Researchers identified 74 matched pairs of communities that shared similar demographic and health conditions, and worked with Mexican officials to conduct household surveys capturing a baseline snapshot of each community’s health status. Then, working independently of the Mexican officials, researchers randomly selected one community from each matched pair for early introduction of Seguro Popular, establishing a controlled framework in which individual changes in health experiences in one community could be empirically compared to control conditions in the matching community.
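The matched-pair randomization described above is straightforward to sketch. The following illustrative script (community names are invented, and this is not the study’s actual code) assigns one community from each of 74 pairs to early rollout:

```python
import random

random.seed(2005)  # a fixed seed makes the assignment reproducible

# 74 matched pairs of demographically similar communities (names invented).
pairs = [(f"community_{i}a", f"community_{i}b") for i in range(1, 75)]

assignment = {}
for a, b in pairs:
    treated = random.choice((a, b))   # coin flip within each pair
    control = b if treated == a else a
    assignment[treated] = "early rollout"
    assignment[control] = "control"

n_treated = sum(v == "early rollout" for v in assignment.values())
print(f"{n_treated} communities treated, {len(assignment) - n_treated} controls")
```

Randomizing within matched pairs, rather than across all 148 communities at once, guarantees that every treated community has a demographically similar control to be compared against.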

“This was the largest randomized health policy evaluation ever undertaken,” Moore said. “We the researchers were involved in experimental design, and in charge of data collection and analysis at the other end. Mexican officials had no control over the results and we had full freedom to publish what we found.”

Residents in test areas were encouraged to enroll in Seguro Popular, and participating Mexican states received funds to upgrade medical facilities and improve access to health services, preventive care and medications. Follow-up surveys show the program is making a difference on its primary objective, documenting a 23 percent reduction in families experiencing catastrophic health expenditures.

According to Moore, “If money is put into a program targeting the poor to receive health insurance, and if that program is well structured, then the poor can actually see reductions in the amount they pay out of pocket for health care. That may seem obvious, but it’s not. Designing a program that’s targeted in a certain way may not mean that resources actually reach the people it’s intended to reach.”

In fact, the Lancet study identified areas where Seguro Popular needs improvement, showing it’s been slow in reaching some residents. Surprisingly, researchers found no measurable, first-year effect on medication spending, health outcomes or utilization of health services. The bottom line, Moore said, is that without objective empirical evaluations of new programs, it’s difficult to say whether funds are being spent effectively.

“This example of arms-length field experimentation and policy evaluation demonstrates how social science can contribute to bettering individuals’ lives,” said Moore. “A great deal can be gained when policymakers are willing to let science steer the evaluation process, when they’re willing to subject themselves to the possibility of being wrong. When they do that, not only is better public policy made in the long run, but we have a stronger case to make for successful policies in the short run.”

Moore is confident the Seguro Popular evaluation template could be used to guide healthcare reforms now contemplated by the Obama Administration. He points to the State Children’s Health Insurance Program, known as SCHIP, as an example of legislation that already incorporates incentives for states to experiment with funding and services. Some Medicare reform plans encourage experimentation as a way to answer questions about what works best, both on cost and quality of care.

If America wants to be ready to make large-scale changes in its health system, now is the time for small-scale testing. “If researchers are allowed to select these test areas – using scientifically and statistically valid methods – we’ll be able to use experimental methods to do good science, to cut through the politics and get the answers we need,” Moore said. “We can get at truth using these randomized experiments.”

Filed under: Government, Health, Politics

Posting hiatus

I will be away for about a week starting today.  I don’t expect to be able to post any articles while I’m gone.  Fret not.  New ideas are in the works and new articles will be up by the end of next week.  Cheers!

Filed under: Uncategorized

A glimpse into the future IV – Cancer Research

Since May is recognized by the United States Congress as National Cancer Research Month, I’m dedicating this edition of “A glimpse into the future” to the men and women working tirelessly to address cancer prevention, diagnostics, treatment, and survivorship.  This post is also dedicated to my mom who is a cancer survivor and has been in remission for over eight years.  She is a woman of immense faith, courage, and inner strength and a great inspiration to me.

  1. Cancer rates have been predicted to grow dramatically over the next 20 years. The number of new cancer cases diagnosed annually in the United States will increase by 45 percent, from 1.6 million in 2010 to 2.3 million in 2030, with a dramatic spike in incidence predicted in the elderly and minority populations.  The overall population is expected to grow by 19 percent during the same period (from 305 million to 365 million).  These statistics were presented by researchers from The University of Texas M. D. Anderson Cancer Center in the Journal of Clinical Oncology.  Regarding disease-specific findings, the leading cancer sites are expected to remain constant – breast, prostate, colon and lung. However, the cancer sites with the greatest expected increase in incidence are: stomach (67 percent); liver (59 percent); myeloma (57 percent); pancreas (55 percent); and bladder (54 percent).  These findings also highlight that the cost of cancer care is growing at a rate that may not be sustainable, and that there is a dire need for new medical oncologists entering the health care system.
  2. The “longevity” protein takes on tumors. Scientists at the Mayo Clinic’s Department of Oncology have identified another anti-cancer effect of the “longevity” protein SIRT1. By speeding the destruction of the tumor promoter c-Myc, SIRT1 curbs cell division. The study was published in the Journal of Cell Biology. The yeast and nematode equivalents of SIRT1 are fountains of youth that stretch lifespan. Whether SIRT1 slows aging in mammals isn’t certain, but it’s beneficial in other ways. The protein tunes up metabolism, reducing blood levels of glucose and insulin, and might forestall neurodegenerative illnesses such as Alzheimer’s disease and ALS.  Yuan et al. determined SIRT1’s effect on the transcription factor c-Myc, whose expression surges in many breast, colon, and liver cancers. The two proteins are tangled in a regulatory loop, the team found. c-Myc latched onto SIRT1’s promoter, spurring cells to manufacture more SIRT1. In turn, SIRT1 detached acetyl groups from c-Myc, hastening its breakdown.
  3. Brain tumor growth may be fought by reversing the effects of an altered enzyme. An international team of scientists from the Moores Cancer Center at the University of California, San Diego, the University of North Carolina and several institutions in China have explained how a gene alteration can lead to the development of certain types of brain tumors – low grade gliomas and secondary glioblastomas, and they have identified a compound – alpha-KG – that could staunch the cancer’s growth. The researchers have shown that when a mutated enzyme fails to do its job, the development of tumor-feeding blood vessels increases, allowing more nutrients and oxygen to fuel cancer growth. They have also shown in the laboratory that they could reverse the mutant enzyme’s effects, effectively blocking this process, called angiogenesis, and provide a potential future treatment strategy against some types of brain tumors. They reported their findings in the journal Science.
  4. The inhibitor of an insulin-like growth factor receptor may reduce the growth of pancreatic cancer. Researchers at Amgen are testing a fully human monoclonal antibody that inhibits the activity of insulin-like growth factors (IGF-1 and IGF-2) and appears to reduce pancreatic cancer cells in early testing, according to a report in Molecular Cancer Therapeutics, a journal of the American Association for Cancer Research. Pancreatic cancer is one of the deadliest cancers, and less than 4 percent of the 200,000 patients diagnosed annually live more than five years. The only available clinical treatment is gemcitabine, but this has yet to show a survival benefit. It is known that insulin-like growth factors play a role in cancer development, particularly in mediating cell survival. According to Amgen, AMG 479, a fully human anti-IGF-1 monoclonal antibody, is the first drug that specifically targets the receptor for these growth factors without cross-reacting with the closely related insulin receptor.
  5. Some promise seen in treating previously drug-resistant prostate cancer. A new therapy for metastatic prostate cancer has shown considerable promise in early clinical trials involving patients whose disease has become resistant to current drugs.  According to research presented in Science Express, the drugs are second-generation antiandrogen therapies that prevent male hormones from stimulating growth of prostate cancer cells. The new compounds – manufactured by the pharmaceutical company Medivation and known as MDV3100 and RD162 – appear to work well even in prostate cells that have a heightened sensitivity to hormones. That heightened sensitivity makes prostate cancer cells resistant to existing antiandrogen therapies.  Of 30 men enrolled in a multisite phase I/II trial designed to evaluate safety, 22 showed a sustained decline in the level of prostate specific antigen (PSA) in their blood. Phase III clinical trials are planned to evaluate the drug’s effect on survival in a large group of patients with metastatic prostate cancer.

Filed under: Health, Science

Stressing out the banks

One of the programs created under the TARP is the Capital Assistance Program (CAP).  The CAP was designed to promote confidence in the financial system by ensuring that the nation’s largest banks have sufficient capital cushion against larger than expected future losses.  A stress test, called the Supervisory Capital Assessment Program (SCAP),  was crafted and implemented by the Federal Reserve and the Treasury to aid them in assessing said capital cushion.  Details of the stress test were released to the public by the Federal Reserve on April 24th, through a 21-page white paper.  I present here a synopsis of the white paper.

All domestic bank holding companies (BHCs) with year-end 2008 assets exceeding $100 billion were required to participate in the SCAP.  According to ProPublica, 19 firms fell under this requirement.  They are listed in the second table below.  These 19 firms hold 66% of the assets and 50% of the loans in the US banking system.  They were asked to project their losses, and available resources for absorbing these losses, for 2009 and 2010 based on two economic scenarios — a baseline scenario and an adverse alternative.  The table below lists the components of the economic scenarios and the effect of the baseline and adverse conditions on each of them.  The supervisors then assessed whether their capital was adequate for them to function during this period.

[Table: Federal Reserve economic scenarios for 2009–2010]

Step 1: Loss Projections. BHCs were asked to project losses for 2009 and 2010 for 12 separate categories of loans held in the accrual book, for loans and securities held in the available-for-sale (AFS) and held-to-maturity (HTM) portfolios, and in some cases for positions held in the trading account.  The losses were to be consistent with the economic outlooks in the baseline and more adverse scenarios.  The BHCs were instructed to estimate forward-looking, undiscounted credit losses, that is, losses due to failure to pay obligations (“cash flow losses”) rather than discounts related to mark‐to‐market values.  The required assessments were broadly classified as:
[Table: The 19 largest bank holding companies by assets, 2009]

  • First and Second Lien Mortgages: institutions provided detailed descriptions of their residential mortgage portfolio risk characteristics – type of product, loan-to-value (LTV) ratio, FICO score, geography, level of documentation, year of origination, etc.
  • Credit Cards and Other Consumer Loans (e.g., auto, personal, student): portfolio information included FICO scores, payment rates, utilization rates, and geographic concentrations.
  • Commercial and Industrial Loans: based on the distribution of exposures by industry
  • Commercial Real Estate Loans: included loans for construction and land development, multi-family property, and non-farm non-residential projects.  Information such as property type, loan-to-value ratios, debt service coverage ratios, geography, and loan maturities was provided.
  • Other Loans: farmland lending, loans to depository institutions, loans to governments, etc.
  • Securities in AFS and HTM Portfolios: the majority are public-sector securities such as Treasury securities, government agency securities, sovereign debt, and high-grade municipal securities. Private-sector securities include corporate bonds, equities, asset-backed securities, commercial mortgage-backed securities (CMBS), and non-agency residential mortgage-backed securities (RMBS).  Supervisors focused on evaluating the private-sector securities.  Loss estimates were based on an examination of 100,000 of these securities.  The loss estimate, and subsequent “write-down” to fair value, for each security was determined from credit loss rates on the underlying assets, consistent with the loss rates for the unsecuritized loans listed above.
  • Trading Portfolio Losses: estimated by applying market stress factors to the firm’s trading portfolio based on actual market movements that occurred between June 30 and December 31, 2008.
  • Counterparty Credit Risk: the risk that an organization is unable to pay out on a credit-related contract when it is supposed to, which directly impacts a firm’s earnings and the value of its assets.  The action taken by the firm to account for this risk is referred to as credit valuation adjustment (CVA).  Supervisors focused specifically on a firm’s loss estimates for mark-to-market losses stemming from CVA associated with market shocks applied to assets in trading books.

Step 2: Resources to Absorb Losses. Institutions were also instructed to provide projections of resources available to absorb losses under the two economic scenarios.  These include the pre-provision net revenue (PPNR) and the allowance for loan losses over the two-year horizon.

  • PPNR is the income after non-credit-related expenses that would flow into the firms before they take provisions or other write-downs or losses.
  • BHCs held an allowance for loan and lease losses at the end of 2008.  They were required to estimate what portion of this allowance would be needed to absorb potential future credit losses on their loan portfolios under each economic scenario.  This calculation could either result in depletion of the year-end 2008 reserves (if the allowance is adequate) or indicate a need to build the reserves (if the allowance is inadequate).

Step 3: Determination of Necessary Capital Buffer. Supervisors examined two main elements as indicators of capital adequacy – pro forma equity capital and Tier 1 capital.

  • Pro forma equity capital was estimated by rolling tax-adjusted net income (PPNR minus credit losses minus reserve builds) for the two-year horizon through equity capital.
  • Tier 1 capital is composed of common and non-common equity, the dominant component being common stockholders’ equity.
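The three steps above reduce to a simple piece of arithmetic: projected losses (Step 1) are netted against the resources available to absorb them (Step 2), and any resulting shortfall against a supervisory capital target is the buffer the firm must raise (Step 3). A minimal sketch follows; all dollar figures and the supervisory target are hypothetical, since the white paper does not publish bank-level numbers at this level of detail:

```python
# Illustrative sketch of the SCAP capital-adequacy arithmetic.
# All figures are hypothetical, not taken from the white paper.

def required_capital_buffer(tier1_capital, projected_losses, ppnr,
                            reserve_build, target_tier1):
    """Return the additional capital a BHC would need under a scenario.

    tier1_capital    -- Tier 1 capital at year-end 2008
    projected_losses -- two-year credit and trading losses (Step 1)
    ppnr             -- pre-provision net revenue over the horizon (Step 2)
    reserve_build    -- required addition to loan-loss reserves (Step 2)
    target_tier1     -- assumed supervisory minimum Tier 1 capital (Step 3)
    """
    # Pro forma capital after absorbing losses with available resources
    pro_forma = tier1_capital + ppnr - projected_losses - reserve_build
    # Any shortfall against the target is the buffer the BHC must raise
    return max(0.0, target_tier1 - pro_forma)

# Hypothetical BHC (figures in $ billions): $40B Tier 1 capital,
# $35B adverse-scenario losses, $20B PPNR, $5B reserve build,
# against an assumed $30B supervisory target
print(required_capital_buffer(40.0, 35.0, 20.0, 5.0, 30.0))  # 10.0
```

Under the baseline scenario the same firm might project smaller losses and need no buffer at all, which is why the two scenarios are run side by side.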

The initial assessment of the capital adequacy, or lack thereof, was conveyed to the BHCs in late April and is expected to be released to the public on May 4th, 2009.  As yet it is uncertain whether the publicized results will reveal much about the banks.

Filed under: Business, Economy, Government

A glimpse into the future III

Here is this week’s look into the world of cutting edge research:

  1. A therapeutic target in Alzheimer’s disease identified. Work led by Professor Mark Pepys FRS over more than 20 years has identified a protein known as serum amyloid P component (SAP) as a possible therapeutic target in Alzheimer’s disease.  In collaboration with Roche he developed a new small molecule drug, CPHPC, which specifically targets SAP and removes it from the blood. In the exciting new work reported in the Proceedings of the National Academy of Sciences, the Pepys team has shown that the drug also removes SAP from the brains of patients with Alzheimer’s disease. In this first study of the drug in patients with Alzheimer’s disease, CPHPC was given to 5 individuals for 3 months.  There was the usual depletion of SAP from the blood, seen in all subjects receiving this treatment, but also remarkable disappearance of SAP from the brain. Laboratory tests revealed for the first time the way in which SAP accumulates in the brain in Alzheimer’s disease.
  2. Nanosensors could lead to development of highly sensitive security and medical devices. Scientists have designed tiny new sensor structures that could be used in novel security devices to detect poisons and explosives, or in highly sensitive medical sensors, according to research published in Nano Letters. The new ‘nanosensors’, made of gold or silver, are about 500 times smaller than the width of a human hair. One is shaped like a flat circular disk while the other looks like a doughnut with a hole in the middle. When brought together they interact with light very differently than they do on their own.  This difference in the interaction with light is affected by the composition of molecules in close proximity to the structures.  The device could be tailored to detect different chemicals by decorating the nanostructure surface with specific ‘molecular traps’ that bind the chosen target molecules. Once bound, the target molecules would change the colors that the device absorbs and scatters, alerting the sensor to their presence.
  3. Understanding and preventing the movement of tumor cells. Tumor cells that lack a certain protein can become extremely mobile and “adept” at penetrating healthy tissue to form metastases. Scientists at the Pharmacology Institute of the University of Heidelberg have identified this protein as the previously unknown cell signal factor SCAI (suppressor of cancer cell invasion).  When the factor’s functioning was disrupted, the cancer cells moved much more effectively. They adapted to the consistency of the respective tissue by changing their shapes constantly and attaching flexibly to surrounding tissues during movement with the help of special surface structures (receptors).   One of these receptors is known as b1-integrin. Suppression of SCAI causes b1-integrin to be overactive and the tumor cell to take on an aggressive form.  The discovery of SCAI, presented in the prestigious journal Nature Cell Biology, could be an interesting starting point for research into new mechanisms for fighting cancer.
  4. Advances in organic LEDs may provide cheap and efficient natural light. Roughly 20 percent of the electricity consumed worldwide is used to light homes, businesses, and other private and public spaces. Though this consumption represents a large drain on resources, it also presents a tremendous opportunity for savings. Improving the efficiency of commercially available light bulbs — even a little — could translate into dramatically lower energy usage if implemented widely. In the Journal of Applied Physics, a group of scientists at the Chinese Academy of Sciences is reporting an important step towards that goal with their development of a new type of light-emitting diode (LED) made from inexpensive, plastic-like organic materials. Designed with a simplified “tandem” structure, it can produce twice as much light as a normal LED — including the “natural” white light desired for home and office lighting.  Progress in this area promises further reduction in the price of organic LEDs.
  5. Doctors look to an inexpensive drug to relieve fibromyalgia pain. Fibromyalgia is a disorder characterized by chronic widespread pain, debilitating fatigue, sleep disturbance and joint disorder. Advocates and doctors who treat the disorder estimate it affects as much as 4 percent of the population.  In a small 14-week pilot study at Stanford, patients were given a low dose of a drug called naltrexone for the treatment of chronic pain.  The drug, which has been used clinically for more than 30 years to treat opioid addiction, was found to reduce symptoms of pain and fatigue an average of 30 percent over placebo, according to the results of the study published in the journal Pain Medicine.  “Patients’ reactions were really quite profound,” said senior author Sean Mackey, MD, PhD, associate professor at Stanford University Medical Center. Still, Mackey and his colleagues remain cautious about recommending the drug this early in the research process.  The researchers are moving ahead with a second, longer-term trial of 30 patients who will be tested during a 16-week period.

Filed under: Health, Science, Technology

Understanding the PPIP

On March 23, 2009, the Treasury Department released details of the Public-Private Investment Program (PPIP), which is one of the programs under the TARP aimed at restoring financial stability.

The details of the program were complicated enough to warrant a standalone post.

The Problem. According to the Treasury, one of the problems plaguing the financial system is that of legacy assets – real estate loans held directly by the banks (“legacy loans”) and securities, or tradable financial instruments, backed by loan portfolios (“legacy securities”).  The true value of these assets has been called into question. As a result, there is uncertainty surrounding the balance sheets of the institutions holding these assets.  Markets don’t like uncertainty, as is evident from their performance, especially that of bank stocks, over the last 12 months.

Proposed Solution. The program’s intent is to repair the balance sheets of these institutions by moving the legacy assets off the books of banking institutions and into the hands of investors.  Cleaner balance sheets could make it easier for banks to raise capital and increase their willingness to lend.

Principles of the Proposal. Treasury will use $75 – $100 billion in TARP money to become co-investors with the private sector, with backing provided by the FDIC and the Federal Reserve.  The “investment partnership” will, thus, be able to purchase $500 billion to $1 trillion of toxic mortgage assets (both residential and commercial) from banking institutions that currently carry them.

There are two separate approaches, one for legacy loans and the other for legacy securities.  Initially, Treasury will split its $75 – $100 billion equity stake equally between the two programs, with the option of shifting the allocation toward whichever program shows the greater promise of success with market participants.

  1. Legacy Loans Program: Suppose a bank has a pool of mortgages with $100 face value that it wants off its hands.  It approaches the FDIC, which determines whether it is willing to leverage the pool at a 6-to-1 debt-to-equity ratio.  If it decides to go ahead, the pool is auctioned by the FDIC, with the hope that several private-sector investors will bid.  The highest bidder would form a Public-Private Investment Fund to purchase the pool of mortgages.  Let’s say the highest bid is $84.  Of this purchase price, the FDIC would guarantee $6 of every $7 in investment, or in this example, $72.  This is the debt financing.  The remaining $12 is equity financing, shared equally by the private investor ($6) and the Treasury ($6).  The private investor would then manage the servicing of the purchased assets.  The private investor and the Treasury would each be able to purchase $14 worth of assets for every $1 of their own money (14-to-1 leverage).
  2. Legacy Securities Program: Treasury will approve up to five fund managers for this program.  A fund manager submits a proposal and, if pre-qualified, raises private capital.  The Treasury plans on being a joint-venture partner.  Let’s say a fund manager is able to raise $100 of private capital for the fund.  The Treasury will co-invest $100 in equity financing alongside the private investor and will provide an additional loan of $100 (debt financing) to the “partnership” fund.  A request for a further loan of up to $100 will also be considered by the Treasury.  As a result, the fund manager has $300 – $400 in total capital for the purchase of securities.  The fund manager has full discretion in investment decisions.  This program will be incorporated into the previously announced Term Asset-Backed Securities Loan Facility (TALF), whose original goal was to provide debt financing (non-recourse loans) to buyers of newly created consumer and small business loans.
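The financing arithmetic in the two examples above can be sketched in a few lines. The figures follow the $84 legacy-loan bid and the $100 legacy-securities raise from the text; the ratios (6-to-1 FDIC leverage, equal equity splits, dollar-for-dollar Treasury matching) are as described in the program details:

```python
# Sketch of the PPIP financing structures described above.
# Figures mirror the worked examples in the text.

def legacy_loans_structure(winning_bid):
    """FDIC guarantees $6 of every $7 of the purchase price
    (a 6-to-1 debt-to-equity ratio); the remaining equity is
    split evenly between the private investor and the Treasury."""
    debt = winning_bid * 6 / 7          # FDIC-guaranteed debt financing
    equity = winning_bid - debt         # total equity financing
    private = treasury = equity / 2     # equal equity split
    leverage = winning_bid / private    # assets bought per $1 of equity
    return debt, private, treasury, leverage

print(legacy_loans_structure(84.0))  # (72.0, 6.0, 6.0, 14.0)

def legacy_securities_structure(private_capital, extra_loan=0.0):
    """Treasury matches private equity dollar-for-dollar and lends
    an equal amount; an additional loan of up to the same amount
    may be requested."""
    treasury_equity = private_capital          # equity co-investment
    treasury_loan = private_capital + extra_loan  # debt financing
    return private_capital + treasury_equity + treasury_loan

print(legacy_securities_structure(100.0))         # 300.0
print(legacy_securities_structure(100.0, 100.0))  # 400.0
```

Note the asymmetry: the loans program leverages each equity dollar 14-to-1 via the FDIC guarantee, while the securities program tops out at 3-to-1 or 4-to-1 per private dollar, which helps explain why investor appetite for the two halves could differ.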

The PPIP has thus far met with a lukewarm reception from the private sector, which is noteworthy given that its participation is essential.  On a conference call with analysts and investors last Thursday, Jamie Dimon, the CEO of JP Morgan and, currently, the go-to guy for the US government on financial mergers/takeovers, said that JP Morgan will not take part in the PPIP.  “We’re certainly not going to borrow from the federal government because we’ve learned our lesson about that,” said the Chief Executive.  Wells Fargo and US Bancorp are noncommittal.

It remains to be seen whether the Public-Private Investment Program will succeed in getting credit flowing again or further reveal corporate fear of business decisions driven by a government that’s looking solely in the rearview mirror.

Filed under: Business, Economy, Government

A glimpse into the future II

In this second issue of what I intend as a weekly tradition, I present five recent research findings from diverse areas:

  1. Stem cell therapy grows new blood vessels. Research led by David Hess at The University of Western Ontario has identified how to use selected stem cells from bone marrow to grow new blood vessels to treat diseases such as peripheral artery disease, one of the severe complications often faced by people who’ve had diabetes for a long time.  Reduced blood flow (ischemia) in their limbs can lead to resting pain, trouble with wound healing and, in severe cases, amputation. The research is published in Blood.  These stem cells have a natural ability to home in on the area of ischemia to induce blood vessel repair and improve blood flow.  The preclinical data from Hess’ research was used by a biopharmaceutical company, Aldagen (www.aldagen.com), to receive FDA approval for a multi-center clinical trial now underway in Houston, Texas, involving 21 patients with end-stage peripheral artery disease.  “These principles could be applied not only to ischemic limbs, but to aid in the formation of new blood vessels in ischemic tissue anywhere in the body, for example after a stroke or heart attack,” says Hess.
  2. Algae could help in manufacture of solar panels that are simpler and more efficient. Engineers at Oregon State University have discovered a way to use an ancient life form to create one of the newest technologies for solar energy.  The secret: diatoms.  These tiny, single-celled algae have dye-containing rigid shells that can be used to generate electricity in a natural way at the nanoscale.  Researchers have created a new way to make “dye-sensitized” solar cells, in which photons bounce around inside the shells as if in a pinball machine, striking the dyes and producing electricity.  This technology may be slightly more expensive than existing approaches to making dye-sensitized solar cells, but can potentially triple the electrical output.  These solar cells work well in lower light conditions and offer manufacturing simplicity and efficiency.  The process involves letting the diatoms settle on a transparent conductive glass surface.  The living organic material is removed, leaving behind the tiny skeletons of the diatoms to form a template.  Titanium dioxide is then precipitated, creating a thin-film semiconductor for the dye-sensitized solar cell device.  This process was presented in ACS Nano.
  3. If you think current microprocessor fabrication is impressive, think again. The ability to create tiny patterns is essential to the fabrication of computer chips and many other current and potential applications of nanotechnology. Yet, creating ever smaller features, through a widely-used process called photolithography, has required the use of ultraviolet light, which is difficult and expensive to work with.  John Fourkas, Professor of Chemistry and Biochemistry at the University of Maryland, and his research group have developed a new, table-top technique called RAPID (Resolution Augmentation through Photo-Induced Deactivation) lithography that makes it possible to create small features without the use of ultraviolet light. This research was published in Science magazine.  Nanofabrication has depended on short wavelength ultraviolet light to generate ever smaller features.  RAPID lithography allows the creation of patterns twenty times smaller than the wavelength of light employed and structures that are 2500 times smaller than the width of a human hair.  RAPID is expected to find applications in areas such as electronics, optics, and biomedical devices.
  4. What if you could repair the damaged cells following a heart attack? A protein that the heart produces during its early development reactivates the embryonic coronary developmental program and initiates migration of heart cells and blood vessel growth after a heart attack, researchers at UT Southwestern Medical Center have found. The molecule, Thymosin beta-4 (TB4), is expressed by embryos during the heart’s development and encourages migration of heart cells. The new findings in mice suggest that introducing TB4 systemically after a heart attack encourages new growth and repair of heart cells. The study appears in the Journal of Molecular and Cellular Cardiology.  “This molecule has the potential to reprogram cells in the body to get them to do what you want them to do,” said Dr. J. Michael DiMaio, associate professor of cardiothoracic surgery at UT Southwestern and senior author of the study. “Obviously, the clinical implications of this are enormous because of the potential to reverse damage inflicted on heart cells after a heart attack.”
  5. The dream of a Hydrogen Economy moves one step closer to reality. The design of efficient systems for splitting water into hydrogen and oxygen underpins the long term potential of hydrogen as a clean, sustainable fuel. But man-made systems that exist today are very inefficient and often require additional use of sacrificial chemical agents.  Prof. David Milstein and colleagues of the Weizmann Institute’s Organic Chemistry Department demonstrated a new mode of bond generation between oxygen atoms and even defined the mechanism by which it takes place.  Their results were recently presented in Science magazine.  The new approach is divided into a sequence of reactions, which leads to the liberation of hydrogen and oxygen in consecutive thermal- and light-driven steps, mediated by a “smart” complex consisting of a metal core – ruthenium – and an outer organic part.  They were able to demonstrate the production of hydrogen gas, oxygen gas, and reversion of the metal complex to its original state.  For their next study, they plan to combine these stages to create an efficient catalytic system, bringing those in the field of alternative energy an important step closer to realizing this goal.

Filed under: Health, Science, Technology
