
Source: theplanningmotive.com

MACHINES LEARN, BUT FIRST THEY HAVE TO BE TAUGHT TO LEARN

The world and our brains are analogue; computers are binary. Analogue is infinitely more complex than binary and cannot be emulated by binary systems, including the qubits found in so-called quantum computers. Because binary systems do not spontaneously form imprints of the external world, they need programming. This programming can, in the first instance, only be undertaken by humans with all their fallibilities.

The purpose of this article is to begin examining the role played by computer algorithms in assisting, and even replacing, financial and investment decision making. It was inspired by the lead article in the Economist magazine (5th–11th October 2019) titled "Masters of the Universe". It poses the question of whether these algorithms are leading or misleading investors, and whether they will be blamed for either delaying or precipitating the inevitable crash of financial markets.

To better understand the limitations of converting an analogue world into a digitally represented one, there can be no better vehicle than the difference between vinyl records and CDs (or latterly streaming). Vinyl records are analogue. Their accuracy in mechanically duplicating music as it emerges depends on the quality of the microphone which captures the sound, the recording equipment, the durability and quality of the medium on which it is imprinted, and the quality of the equipment which reproduces the sound. It is possible, using a non-PVC resin, to produce a medium which is more durable and which more faithfully imprints the sound. But that is by and by. Music lovers tend to favour vinyl, or analogue, music because it is "warmer".

The reason for this is that the conversion of analogue into digital requires sampling. This process is called analogue-to-digital conversion (ADC) and it involves quantization: in this case an industry standard of 44,100 samples per second is used, in which continuous analogue sound is broken down into discrete digital samples. The ADC cannot hope to produce a perfect imprint, because a continuous signal contains more detail than any finite number of samples can capture. This necessarily results in what is called "jaggedness". For example, under a strong magnifying glass, digital smartphone pictures resolve into individual pixels, whereas a microscope's view of a slide will reveal smaller and smaller details until its limits are reached.
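To make the sampling-and-quantization step concrete, here is a minimal Python sketch. The function name, the 440 Hz test tone and the one-millisecond duration are all illustrative choices of mine, not drawn from any audio library; only the 44,100 samples per second and 16-bit depth are the CD standard:

```python
import math

def sample_and_quantize(signal, duration_s, rate_hz, bits):
    """Sample a continuous signal at rate_hz and round each sample to
    one of 2**bits discrete levels -- the quantization step of an ADC."""
    n = int(duration_s * rate_hz)
    levels = 2 ** bits
    samples = []
    for i in range(n):
        t = i / rate_hz
        x = signal(t)                          # true analogue value in [-1, 1]
        q = round((x + 1) / 2 * (levels - 1))  # nearest discrete level
        samples.append(q / (levels - 1) * 2 - 1)
    return samples

# A 440 Hz tone digitized at the CD standard: 44,100 samples/s, 16 bits.
tone = lambda t: math.sin(2 * math.pi * 440 * t)
digital = sample_and_quantize(tone, duration_s=0.001, rate_hz=44_100, bits=16)

# The "jaggedness": each stored sample differs from the true analogue value
# by up to half a quantization step (about 1/65535 here), never by zero.
max_err = max(abs(tone(i / 44_100) - s) for i, s in enumerate(digital))
```

However fine the grid (more bits, more samples per second), the error can be made smaller but never eliminated, which is the point the vinyl comparison is making.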

Thus what algorithms do is sample the outside world or, more often, data streams. Algorithms may be presented as the future "eye of god", but they are not, nor will they ever be. The world is far too complex and fluid for that. Rather, they are focused mathematics constructed by humans for inquiring into desired traits and connections. Where algorithms have been given more or less free rein, they invariably return with misleading results, which are discovered only after human researchers have wasted much time and effort verifying them.

Furthermore, algorithms do not have a life of their own. They are not immaculately conceived. They embody the limitations, prejudices and biases of their programmers. Their tunnel or skewed digital vision is not accidental; it is the product of the recognised or unrecognised chauvinism, racism, religiosity and nationalism of their programmers. Nowhere is this truer than in the realm of financial algorithms, where programmers blind the algorithms with their "vulgar economics" and false assumptions.

The blind leading the blind.

The two articles in the Economist reveal that machines are taking over investment decisions, especially in US stock markets, which are valued at $31 trillion, a fabulous sum. "Funds that are run by computers that follow rules set by humans account for 35% of America's stockmarket, 60% of institutional equity assets and 60% of trading activity" (my emphasis). And again: "Exchange-traded funds (ETFs) and mutual funds which automatically track indices…had $4.3 trillion invested…exceeding the sums actively traded by humans for the first time." Finally: "Three years ago quant funds became the largest source of institutional trading volume…They account for 36% of institutional volume so far this year…" Quant funds are interesting because they, more than any, base decisions on quantitative sampling.

The Economist is of the view that all this computerisation has improved the efficiency of the market because it has lowered transaction costs and increased the speed of transactions and of the delivery of the instrument being traded. But the purpose of the market is not efficiency, it is discovery. As the Economist says in its opening sentence: "The job of capital markets is to process information so that savings flow to the best projects and firms." In the words of Marxists, the job of investment (credit) markets is to direct investment from industries and firms with below-average rates of profit to firms and industries with above-average rates of profit.

This seems very simple. How difficult is it to write such an algorithm? Extremely difficult, because the market is chaotic, and impossible unless the algorithm is guided by a theory of capitalism that is real and evolving. The Economist gives a nod in that direction. It quotes Bryan Kelly of Yale University, who investigated the efficacy of purely machine-derived factors, which in the end turned out to be spurious. "He says combining machine learning with economic theory works better." So it is economic theory that converts machine learning from yielding spurious results to yielding "profitable" results.

The reason for this is that economic theory concerns itself with how the economy works, and until you have worked out the how, you cannot know the what to look for. As the Economist quotes elsewhere: "If you apply an algorithm to too large a data set often it reverts to a very simple strategy, like 'momentum'". And: "A machine learning strategy that does not employ human logic is bound to blow up eventually if it is not accompanied by deep understanding". The problem with programmers, even ones led by bourgeois economists, is that they have no fundamental understanding of the capitalist economy, only a superficial one.
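The "very simple strategy" that quote refers to can be stated in a few lines. What follows is my hypothetical sketch of a bare momentum rule, not any fund's actual code; it shows how little economic understanding such a strategy embodies:

```python
def momentum_signal(prices, lookback=3):
    """The crude 'momentum' rule a strategy can collapse into:
    buy (+1) if the trailing return is positive, sell (-1) if negative,
    with no reference to profitability or anything else fundamental."""
    if len(prices) <= lookback:
        return 0                      # not enough history yet
    trailing_return = prices[-1] / prices[-1 - lookback] - 1
    if trailing_return > 0:
        return 1
    if trailing_return < 0:
        return -1
    return 0

# A rising price series triggers a buy, a falling one a sell,
# the economics of the underlying firm entirely unexamined.
buy = momentum_signal([100, 101, 103, 106])   # +1
sell = momentum_signal([106, 103, 101, 100])  # -1
```

The rule only looks backward at the price series itself, which is precisely why, without a theory of what drives profit, it "is bound to blow up eventually".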

Being a flat earther does not prevent one from building a house, because for all intents and purposes a thirty square metre plot, unless it is on the side of a mountain, is essentially flat. However, if an engineer seeks to build a 3 km suspension bridge, (s)he can no longer be a flat earther, because that bridge cannot be built unless the curvature of the earth is taken into account. If an engineer seeks to build a GPS satellite, it is not enough to know the earth is a sphere with curvature; it is necessary to know that this curvature is not uniform, and neither is the earth's magnetism. Thus, depending on the requirement, a more involved and deeper understanding is required. Flat-earther economists can get away with day-to-day observations of the market. More strategic ones detect the lumpiness in the market. But what they all lack is that deep, deep understanding of capitalism. That is why, with few exceptions, they failed to detect the Crash of 2008, which so enraged Queen Elizabeth. And it is why they have failed to understand the significance of the current tremors pre-dating the new Crash.

In the old days it was said of computers that if garbage went in, garbage would come out. Today algorithms intervene, filtering out some of the garbage going in. We thus have to modify the rubbish-in-rubbish-out slogan. Instead we should say: a rubbish algorithm ensures that the data going in is converted into garbage coming out.

There is an additional problem in a market driven by algorithms: the noise they generate. The Economist misses the point when it quotes seasoned investors who worry that algorithms chase securities with given characteristics only to dump them when those securities become too expensive, breaching the "given characteristics". This is a minor element. Recall that algorithms are data driven. But these algorithms are themselves altering the data sets whenever they execute orders in the market. There is thus an enormous amount of looping going on. If enough of them modify the data by adding to it, they create a massive false positive. In the case of momentum, this means they add to the momentum, which adds to the momentum, regardless of economic logic.
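This looping can be illustrated with a toy simulation. Every number here (the 1% per-step price impact, the eight steps) is invented for illustration: the point is only the structure, in which the algorithms read the last price change as their signal, and their own resulting trades become the next change they read:

```python
def momentum_feedback(initial_change_pct, impact_pct=1.0, steps=8):
    """Toy feedback loop: algorithms buy after any price rise (sell after
    any fall); the herd's own orders move the price by impact_pct, which
    the data-driven algorithms then read as fresh momentum next round."""
    price = 100.0
    last_change = initial_change_pct
    path = [price]
    for _ in range(steps):
        direction = 1 if last_change > 0 else -1 if last_change < 0 else 0
        last_change = direction * impact_pct    # their own price impact
        price *= 1 + last_change / 100          # ...becomes the new data
        path.append(round(price, 2))
    return path

up = momentum_feedback(0.1)     # a 0.1% uptick becomes a sustained run
down = momentum_feedback(-0.1)  # a 0.1% downtick, a sustained slide
```

A trivial initial move of 0.1% compounds into a run of over 8% in either direction, with no fundamental news entering the loop at any point. That is the massive false positive.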

What should we call these kinds of superficial algorithms? The correct term is psychological algorithms, since they reflect the psychology of the market and those who populate it. The fact that these algorithms are mathematically complex and burn lots of computer time is by and by. Maths without the correct assumptions is like a car without a steering wheel: elegant but directionless.

It appears the Economist has a premonition of the pending algorithm-led Crash. "The greatest innovations in finance are unstoppable, but often lead to crashes as they find their feet." Here the Economist is looking not only forward but backward, to the bubbles created by the advent of joint-stock companies in the 19th century, which gave rise to such things as the overinvestment in railways.

Where the Economist is wrong is to declare that "computers don't panic". This is a misreading of the situation. The psychology of the market is the catalyst which in the end determines whether greed or fear will prevail. Thus algorithms which dig into these attitudes are not without momentary merit. Marx made the important observation that consciousness tends to lag objective reality. Greed tends to extend the bull run, whereas fear tends to extend the bear run. In short, changing conditions precede the recognition of those changes, and even more so the revised decisions based on that new recognition. Often these new conditions or developments have to become established before they attract attention or affect decision making.

In this context psychological algorithms are a double-edged sword. On the one hand, they could reinforce the psychology of the market by both chasing momentum and adding to it, long after the fundamentals have changed. On the other hand, they could be programmed to detect changes in conditions faster than the human brain, that is the brain of the investor, can detect them, provided they are looking in the right direction. In the former case they would perpetuate the existing run; in the latter, limit it. It all depends on which set of programmes is dominant and which carries the most financial weight. If the former set of programmes is in force, and the statistics given earlier in the form of ETFs suggest it is, then these algorithms will cause a panic by driving the markets off a higher cliff than would have been the case were seasoned investors in charge. They could set off an unprecedented panic as algorithms collectively reverse their trades and head for the exits more quickly and brutally than human investors ever could.

In the aftermath of the 2008 Crash, Mr Greenspan, who had been the long-standing chairman of the FED, admitted that he did not understand sub-prime instruments due to their complexity. No doubt in 2019 and 2020, in the aftermath of this crash, the same bullshit will be admitted to. We are bound to hear the same sorry individuals declaring: "we did not understand the algorithms; not only were they too complex, but they had a life of their own".

When is a trade war deal not a deal, and when is quantitative easing not actually quantitative easing?

The markets have recovered their mojo over the last week. This has happened not because fundamentals have improved (they continue to deteriorate) but because the FED has become more interventionist and a trade deal has been "achieved".

In 2008, it was the major investment banks that were propping up the markets, especially the mortgage market, despite mortgage defaults skyrocketing. These days it is the government, especially the White House and the FED. No doubt all the algorithms have been tasked to focus on monetary policy and the trade conflict. They are alert to any developments, with high-frequency traders determined to be first off the block. But all is not as it seems. Take the trade deal: it was anything but. It was a truce, an agreement not to escalate, and therefore a deferred agreement. But to algorithms seeking out key words, Trump's valorous triumph was sufficient. The maths did not get the nuances, or maybe they will over the weekend as more critical human comments emerge in cyberspace.

The FED itself plays a nuanced game. On Friday morning Bank of America opined that "the Fed needs a bazooka of asset purchases." However, they said, that's unlikely to happen, and the central bank will probably buy only $25 billion to $50 billion a month in Treasury bills, "to guard against the perception of QE." That afternoon the FED did unpack its bazooka, announcing monthly $60 billion purchases of assets, having already spent $180 billion propping up the REPO market, which it promised to continue supporting. https://finance.yahoo.com/news/fed-brings-bazooka-fight-repo-170254896.html

According to the FED, these are not crisis measures, merely technical ones. "These actions are purely technical measures" and "purchases of Treasury bills likely will have little if any impact on the level of longer-term interest rates and broader financial conditions." Why is the FED being so coy and going to such lengths to fool the algorithms? If the FED were to admit to crisis conditions, this would undoubtedly spook the markets and confuse the algorithms. Instead it says it is adding reserves, not easing monetary policy, because the unwinding of its balance sheet was overdone.

Hmmm. The graph below suggests this cannot be correct. Prior to the 2008 Crash the assets of the Federal Reserve were under $1 trillion, or in inflation-adjusted terms around $1 trillion. At its height, up to 2018, the balance sheet stood at $4.5 trillion. The unwinding up to the REPO emergency amounted to just $700 billion, leaving assets at $3.78 trillion, nearly four times their level prior to the 2008 Crash.

Graph 1.

It may of course be argued that since 2008 the economy has grown significantly, and public debt more so, due to growing fiscal deficits. However, even when a comparative analysis is done, the results still indicate that the cause of the growing financial crisis is not the one provided by the FED. Graph 2 provides the result of this comparative analysis. Public debt may be elevated at $22 trillion currently, but so too is the FED's balance sheet. Admittedly, the ratio has declined 50% since 2015, but FED assets as a share of federal debt are still 70% higher than the level obtaining before the 2008 Crash.
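As a rough sanity check on the Graph 2 ratio, the arithmetic can be laid out using the figures already given in the text plus one assumption of mine: a pre-crash public debt of roughly $9 trillion, which is in the region of FRED's GFDEBTN series for 2007-08:

```python
# Figures from the article: ~$0.9tn FED assets pre-crash, $3.78tn now,
# $22tn public debt now. The $9tn pre-crash debt is my assumption.
fed_assets_2007, debt_2007 = 0.9e12, 9.0e12
fed_assets_2019, debt_2019 = 3.78e12, 22e12

ratio_2007 = fed_assets_2007 / debt_2007   # FED assets ~10% of federal debt
ratio_2019 = fed_assets_2019 / debt_2019   # FED assets ~17% of federal debt
increase = ratio_2019 / ratio_2007 - 1     # roughly a 70% rise in the ratio
```

On these rough inputs the ratio has risen by about 70%, consistent with the claim above: even scaled to the much larger public debt, the balance sheet remains far above its pre-crash weight.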

Graph 2.

(Sources: FRED series GFDEBTN for public debt and WALCL for FED assets.)

The real problem is the age-old one for capitalist firms and banks: lack of liquidity. FactSet's latest Earnings Insight (11th October) projects annual revenue growth of only 2.7%, with profit margins down 2%. This combination is lethal for cash flow. In reality, adjusted for share buybacks, inflation and creative accounting, it implies a sharp reduction in both cash flow and its unpaid element. In a previous posting I highlighted the issues facing the tight oil industry in the USA, where negative cash flow is endemic. It is likely the same applies to an increasing number of industries, and to the banks that make up their deficits.

Conclusion.

The next three weeks, which constitute the reporting season for the third quarter, will be a stern test for the algorithms. The profit motive is the driver of capital. It cannot be ignored, and when it is, it comes around with doubled force. At the beginning of the year the projection was for a rebound in profits in the second half. The opposite has materialised: profits have fallen and will continue to fall despite the truce in the trade war. The question is, will the algorithms give due weight to this, or will they continue to focus on monetary policy and trade war tweets? If they do the latter, this will only delay the crash, at the expense of engorging it. We shall know shortly.

Brian Green, 13th October 2019