Lean Environment

One of the best rules anybody can learn about investing is to do nothing, absolutely nothing, unless there is something to do... I just wait until there is money lying in the corner, and all I have to do is go over there and pick it up... I wait for a situation that is like the proverbial ‘shooting fish in a barrel.’
— Jim Rogers
There's gold in 'em thar Big Data hill!

Lean Data: A better way to save bullets?

The principles of marksmanship have much to teach us about accuracy and precision in financial trading. Accuracy is commonly understood as the ability to hit the target, whereas precision is its repeatability. While saving bullets may not be a marksman’s top concern, compensating, accounting, or controlling for environmental factors that affect practical accuracy is an important aspect of good marksmanship. The effects of key environmental factors such as temperature, humidity, air density, precipitation and wind all need to be understood separately, and their combined effects estimated, by a good marksman. This of course assumes that the marksman already knows his own intrinsic accuracy, as well as the inherent mechanical accuracy of both his firearm and ammunition. After all, the intrinsic accuracy of the combination of rifle, ammunition and shooter is determined by the worst performing of these components. A stable, predictable (or even controlled) environment narrows the scope of variability so a marksman can aim and fire a series of shots with the greatest accuracy and precision.


Without precision, a high degree of accuracy is nearly impossible. In order to accurately strike a target, the shooter must adjust the aim to account for several variables. To the extent the repeatability of the firearm or weather conditions is poor, this adjustment becomes correspondingly uncertain – in other words, with poor precision the shooter will simply have to guess and hope for the best. This is hardly a recipe for success.

In financial trading, precision comes from the ability to model the external macroeconomic environment and understand how it affects the overall performances of a wide range of trading strategies, e.g., trend-following, mean-reversion, statistical arbitrage or other quantitative strategies. Given the inherent constraints on the number of environmental variables that we can practicably monitor and track, it makes sense to concentrate our resources on a core set of key parameters with the widest applicability. Macroeconomic variables such as exchange rates, interest rates, inflation, unemployment, GDP, etc. would fall within the core set of such key parameters. We call this the “Lean Data” approach, as opposed to its better known counterpart, i.e., the “Big Data” approach, which might have many more types of variables, in addition to macroeconomic variables, that may also include weather, geological data or even consumer sentiments. As we shall see, more is not necessarily better, just as free is not necessarily useless.

Suppose that we have n Big Data variables from which a subset of k Lean Data variables can be chosen. As there are n!/(k!(n-k)!) ways to choose k elements from a set of n elements, we see that this can easily support an ecologically diverse niche along the HFT vs. Big Data continuum. Multiple trading firms with differentiated survival strategies can thus be accommodated.
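The size of this niche is easy to make concrete. In the minimal sketch below, the counts n = 50 and k = 5 are purely illustrative assumptions (the text does not specify any particular numbers); the point is only that even modest counts yield millions of distinct Lean Data subsets:

```python
from math import comb

# Number of ways to choose k Lean Data variables from n Big Data variables:
# n! / (k! * (n - k)!)
n, k = 50, 5  # illustrative counts only, not from the text
subsets = comb(n, k)
print(subsets)  # 2118760 distinct Lean Data subsets
```

With millions of possible variable subsets, many firms can occupy differentiated, non-overlapping niches along the continuum.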

The objective of a Lean Data approach to trading is to achieve statistical repeatability of a desirable outcome in the aggregate, based on executing specific strategies a large number of times within a stable and predictable environment. In other words, we aim for precision in financial trading, knowing that, as in marksmanship, precision naturally leads to accuracy. However, there is, as they say, no free lunch in finance. What we give up for precision is the trade opportunities we are unsure about (i.e., false negatives), based on a quick cost-benefit analysis in real time as each trade opportunity arises. In times of uncertainty, we simply do not trade. While we are unable to predict the outcome of each specific trade, we can reasonably expect a positive result in the aggregate outcome of all executed trades. Trading strategies based on statistical arbitrage, or to a lesser extent mean reversion, would exhibit this type of behavior.
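The trade-only-when-confident idea can be sketched as a toy expected-value calculation. All probabilities, payoffs and the confidence threshold below are invented assumptions for illustration: near-coin-flip opportunities are skipped (accepting false negatives), and the expected profit of the trades actually taken is summed:

```python
# Hypothetical opportunities: (estimated win probability, gain if right, loss if wrong)
opportunities = [
    (0.70, 1.0, 1.0),
    (0.48, 1.0, 1.0),  # too uncertain: skipped (a false negative if it would have won)
    (0.65, 1.0, 1.0),
    (0.52, 1.0, 1.0),  # near coin-flip: skipped
    (0.80, 1.0, 1.0),
]

CONFIDENCE_THRESHOLD = 0.60  # trade only when sufficiently confident

def expected_pnl(p, gain, loss):
    """Expected profit of a single trade with win probability p."""
    return p * gain - (1 - p) * loss

taken = [o for o in opportunities if o[0] >= CONFIDENCE_THRESHOLD]
aggregate = sum(expected_pnl(*o) for o in taken)
print(len(taken), round(aggregate, 2))  # 3 trades taken, positive aggregate expectation
```

No single trade's outcome is predictable, but restricting execution to high-confidence opportunities makes the aggregate expectation positive, which is the precision-first behavior described above.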

In times when good data for supporting precision is unavailable, the Lean Data approach has another trick up its sleeve. Compared with the highly accurate sniper rifle of the Big Data approach (where availability of good data is a prerequisite), the Lean Data approach can sometimes behave like a machine gun that is not very accurate to start with. However, a machine gun using the same ammunition as a sniper rifle can be effective at much greater range, because its rapid-fire capability lowers the accuracy required for effective use: within a much larger target circle, it scores one or more hits along with numerous misses at random locations. Again, there is no free lunch in finance. What we trade away for the greater range is tolerance of a larger number of misses (i.e., false positives) arising from the lack of good data. Provided we control for the environmental factors, and correctly calculate in real time that the benefits of a few hits exceed the total costs of all misses plus the costs of all the bullets, we can still expect an overall positive financial outcome. Trading strategies that make lots of small bets on large-impact (aka "Black Swan") events the market considers unlikely fall under this category. In a more limited sense, a trend-following strategy with trailing stops that suffers through multiple stop-outs before finally catching a trend exhibits similar behavior.
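The machine-gun arithmetic takes only a few lines. The bet count, hit probability and payoff below are invented for illustration, not drawn from any real strategy; the calculation simply checks that the expected gain from a few hits exceeds the total cost of all the bullets:

```python
# "Machine gun" sketch: many small bets on events the market deems unlikely.
n_bets = 1000           # bullets fired
cost_per_bet = 1.0      # premium paid per bet (the cost of each bullet, hit or miss)
hit_prob = 0.005        # chance that any single bet pays off
payoff_per_hit = 400.0  # large payoff when a "Black Swan" bet hits

expected_hits = n_bets * hit_prob               # about 5 hits expected
total_cost = n_bets * cost_per_bet              # cost of all bullets, misses included
expected_gain = expected_hits * payoff_per_hit  # 5 hits x 400 = 2000
print(expected_gain - total_cost)  # 1000.0 expected aggregate profit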

One can thus see that the Lean Data approach can support a broad class of well-known trading strategies, depending upon the availability of good data that reflect market conditions. By choosing to concentrate our resources on monitoring and tracking a core set of macroeconomic parameters, we ensure that our models and strategies are reasonably informed of the external environmental factors that affect trading performances in the aggregate.

Vertigo-Inducing Big Data.

Lean Data: Home Sweet Home.

HFT Environmental Hazards.

An interesting way of thinking about the Lean Data approach is to visualize it as an environment in which Scrat, the saber-toothed squirrel from the Ice Age films, struggles to hold on to his prized acorn. The Big Data environment is vertigo-inducing and scary, whereas the High-Frequency Trading environment is hazardous and filled with piranhas. The small-world environment of Lean Data, while no Scratlantis, is nevertheless familiar, cozy and comforting to Scrat (and his prized acorn). But are macro models by themselves sufficient for driving Lean Data financial trading in a Big Data world without suffering too much of a handicap?

One plausible answer, we think, is to consider how the macro models might have already captured the compact structure of the macroeconomic world, and thus can support causal reasoning in a variety of trading models where the external environment exerts influence. Computer scientists generally believe that a sufficiently compact program that explains a complex world essentially captures reality. As Eric Baum, author of the book “What is Thought?”, explains: “the only way one can find an extremely short computer program that makes a huge number of decisions correctly in a vast and complex world is if the world actually has a compact underlying structure and the program essentially captures that structure.”

Therefore, if trading models constrain their reasoning and learning to deal only with meaningful quantities (i.e., vetted by a diverse network of human economists and codified into macro models), their decisions and actions would more closely correspond to macroeconomic reality. Furthermore, if machines, like humans, understand the world through meaningful concepts and only search through meaningful possibilities, the load on computational resources becomes more manageable. In other words, human experts, through their collective research efforts, can provide the metaphorical DNA to the machines, giving them a “running start,” preempting the re-invention of the proverbial wheel and thus saving valuable computational resources. This is how we envision the Lean Data approach making an important contribution to financial trading: by demonstrating a knowledge paradigm in which machines and a network of human experts can synergistically collaborate.

An expert is someone who knows some of the worst mistakes that can be made in his subject and how to avoid them.
— Werner Heisenberg (“Physics and Beyond”, 1971)


  1. Taleb, Nassim Nicholas (2010). The Black Swan: The Impact of the Highly Improbable (2nd Edition). Random House.
  2. Baum, Eric B. (2006). What is Thought? A Bradford Book.