
Hi all,

The statistics look more ominous (for some) with each passing year as the American public moves further from mainstream news and deeper into alternative media. Americans are flocking in droves to alternative news sites on the internet and, in particular, to alternative news productions on YouTube. Statistics confirm that the comments sections of mainstream news articles are more influential on public opinion than the articles themselves. It is as if there is some kind of wholesale rejection of mainstream thought. What is behind this trend? To explain it, a brief digression into political ideology is needed.

Prior to about the 14th Century, western culture was dominated by a normative confirmation bias that pervaded human thought: religion shaped and defined how we viewed and understood our world. Virtually nothing in people’s lives was unaffected by this lens. Whether it was science, society, family, politics or any other matter, religion framed how we understood everything. Even explanations of the cosmos had to pass a normative confirmation bias test; if an idea about the workings of the cosmos was incongruent with the religious lens, it was rejected. Within western religion there were differing “flavors”, different sects which occupied the populace in endless debate over which one represented the real “truth”. But oddly, what seldom occurred to those living in this time was that religion itself might not be necessary. While there were certainly atheists in that day and age, for most people it was not widely known or understood that one could simply not believe in religion (or “god”) itself, whether that meant continuing to believe in a god in one’s own way or not believing at all. To promote this idea widely could severely jeopardize one’s own self-preservation, so wherever the view was held, it was held in private, and the idea that religion was not needed to understand our world was not widely debated. Some scholars debated it as time went on, but mostly after the Enlightenment, well beyond the 14th Century. And that brings us to the next point. The idea that we might explain our world without religion didn’t appear overnight. It gained popularity in the general population gradually, breaking into public discourse during the Enlightenment and slowly expanding over the next two or three hundred years.

But what seems to have escaped the attention of the larger discourse is the fact that normative confirmation bias dies a hard death. To explain this, we should first characterize religion more generally in this context, so as to provide a more objective way of identifying normative confirmation bias in whatever guise it may assume in different times and places. The key feature of religion vis-à-vis normative confirmation bias was that religion provided a general theory by which the universe could be explained; one only needed to deduce specific conclusions from that theory to know how to interact with one’s world. It was, in a sense, a simplex way of thinking. But what the Enlightenment taught us was that there was another way to understand our world, a duplex way of thinking. Instead of applying a general theory as a given, we could empirically observe the world around us to induce – vice fabricating from whole cloth – a theory of how our world works. Once such a theory is established, we can then deduce from it how to interact with our world. This was the birth of empirically driven reason. By observing our world in a series of “experiments” we could, from those specific cases, induce a general understanding of how the universe worked. Once that understanding was established, knowing how to rationally interact with our world became a simple matter of deduction. Thanks to established theory, we then had a predictive tool to guide our actions. What this lesson should have taught us is that general theories not founded in empirical observation are the mother of normative confirmation bias. But alas, we did not learn this lesson. We merely rejected religion (to some degree of approximation) and failed to apply the same logic to everything else. And that is partly because the tools of empirical observation were limited in the scope of subjects to which they could be applied. Anything that escaped our capacity to empirically characterize therefore remained in place as a potential “mother” of normative confirmation bias.

As the years wore on, our capacity to empirically measure previously vague and inscrutable matters such as sociology improved. But while this capacity to empirically measure our sociological world grew, our understanding of how to manage society based on that sociology remained fixed: we continued to base decisions on the management of human society on theories derived from whole cloth, independent of those measures. To be sure, most of these theories had considerable thought put into them, but that is not the same thing as an empirically driven understanding in which a theory is induced from specific experimental observation. This wealth of intellect and thought was no different in form from the same kind of theorizing applied in the religious context, which we came to know as theology. Neither was born of empirical observation. Thus, to the surprise of many today, just as it was a surprise to adherents prior to the 14th Century, neither religion nor political ideology is necessary to understand our world. Today, our capacity to empirically measure social indicators and effects is robust, but our mental paradigm is still stuck in the past, on theories of political ideology which were either devised hundreds of years ago or have antecedents in the same. We can trace most western political ideology today to the Enlightenment and, further back, to such philosophers as Aristotle and Plato. Certainly, more has been added since, but the basic methodology hasn’t changed. What we see today in political ideology is a mélange of ideas from various philosophers of the past, ideas born whole cloth long before the empirical ability to measure sociological factors existed. In the same way that religion fueled ignorance and superstition, political ideology relegates our understanding of how to manage society to a simplex method devoid of any true empirical foundation.
Of course, we can argue semantics and call this view itself political ideology. But there is a fundamental distinction here that renders that view suspicious: this new understanding of how to manage society is a duplex method based on a direct appeal to empirical observation of society upon which law and economics should operate. In this enlightened view we attempt to inform our understanding of law and economics on means and methods consistent with the scientific method. It will not be complete for sure, since our capacity to measure sociological features of our universe are limited, but the change is a paradigm shift.

Small, weak steps in this direction have been taken by academics who attempt to understand the effects of policy on society, but the universal failing in all of them is the mere fact that political ideology as a thought paradigm exists and is pervasive (normative) in the same way that religion was before the 14th Century. Any academic who tries to empirically observe the universe in a world dominated by religious thought will encounter the same normative confirmation bias, and the results will reflect that bias. They will “see” their data through the lens of religion and will inexorably apply a religious world-view to it. That’s why it is called “normative”. Thus, what humanity needs is a Reformation of thought, a Reformation of Enlightenment that removes this confusion, so that those living today can realize, first and foremost, that political ideology isn’t needed to understand how to manage human society. As it stands today, most cannot even see that fact and falsely believe that political ideology is somehow a necessary construct. But as we saw with religion, this is not true; it is believed only because the “thought system” of political ideology is normative in the present. This means that the very basis of governance, the systems of government that exist today, need reform, and the best way to do that is what adherents did to reform religion: diminish the role of the priest class. In the same way, the role of the political class in our systems of government must be diminished. This can be done by increasing direct public participation in governance. While direct democracies do not work, we need not accept the reductio ad absurdum to forgo progress. What we need is a system in which an elected political class determines public policy by direct consultation and participation of the public, in much the same way that Western Courts rely on juries and judges operating together to render verdicts.
In Western Courts, judges render juries competent, but juries provide the “conscience” of the Court. Judges provide uniform (hopefully) jury instructions and other rules of operation by which the jury operates. The jury provides the conscience of peers. In a similar way, statute and public policy should be created. This is, indeed, the future of neo-liberal western democracy. I’ve spoken of this in other contexts where I mention how technological agency is making this Reformation all the more urgent, but this article explains more of the background to this reasoning and the “why” behind what is happening.

So, what does all this have to do with the death of mainstream journalism? What the death of mainstream journalism is revealing is that technological agency (in this case the information age) is forcing this Reformation upon us and we need to be informed as to the “why” behind it so that we can make wise choices to guide this Reformation, lest the “Reformation” lead to chaos. The only way to control this Reformation being forced upon us is to increase direct public participation in governance, and we must do it as quickly as possible. Unlike the 14th Century, today time cycles move quickly and we don’t have two hundred years to sort this out.

Finally, I can offer my opinion on what mainstream journalism can do to save itself from irrelevance and destruction. In the past, with the normative confirmation bias of political ideology holding sway, journalists attempted to report the “who, what, when, where and why” – the five Ws. But because of normative confirmation bias, they tended to “go a little deeper” and report the “true” news by identifying the five Ws of a given story, which I’ll call an event, in order to discover the “true” event narrative. This meant that some, if not a lot, of effort was spent trying to discount or “disprove” any competing event narrative. In other words, each story had any number of event narratives which could be applied to it, and journalists attempted to “discern” which narrative was the “true” one. But the reality is that in stories involving inherently subjective matter, it isn’t always possible to do this. Only by applying a confirmation bias could the “true” narrative be “found”. One exception, in a pure sense, would be matters of pure scientific fact. But rarely does a story involve only pure scientific fact. In almost all stories reported there is some element of subjectivity, either in the core of the story or in the details surrounding it, even when a core scientific fact can be established. In other words, seldom if ever do scientific facts stand alone as the only issue at hand in a story. As an example, take anthropogenic climate change, a case in which a core scientific fact can presumably be established by a journalist. But the story of anthropogenic climate change involves more than just that core scientific fact. There are competing narratives of “truth”, each with its own set of the five Ws. In this case, the “what” is the veracity of the scientific consensus.
What journalists do today is try to validate one of these narratives by applying normative confirmation bias – usually political ideology – and thereby reject competing narratives, usually by not reporting on them at all, or only incompletely. The role of a reporter is to report on all the narratives by providing a complete accounting of the five Ws of each, rather than trying to promote one narrative or another. There is no need to do this. There is no need for political ideology, or any other form of normative confirmation bias. The obvious objection is that this is an idealized way of understanding reporting, since we know that a “true” narrative, at least sometimes, does exist. At least a true “what” exists.

And it is this dissonance between the ideal and the practical that is the cause of the decline of mainstream journalism.

Let me explain. In the information age the consumer has the option of patronizing those outlets that comport with their own confirmation bias. Therefore, in an unrestricted society of free speech, the consumer will simply ignore any “mainstream” outlet that “chooses” an event narrative inconsistent with their own confirmation bias. Ultimately, this will result in the disappearance of any mainstream outlet, since all surviving outlets will be outlets of personalized confirmation bias, choosing event narratives for each story according to the presumed confirmation bias of their consumers. The effect of this is ignorance and superstition, to put it bluntly. This is the result of genuine choice for the consumer. Thus, the only way for a mainstream outlet to avoid self-destruction is to take the ideal tack; that is, it must report on all conceivable event narratives for each story in complete form, providing the five Ws for each. By virtue of the fact that the outlet is mainstream, most consumers will stop there because, in the midst of that report, they will apply their own confirmation bias to choose their own event narrative from those provided.

The advantage is that the mainstream outlets are still seen as valuable by the consumer because, if the narrative that comports with one’s own confirmation bias is presented, the consumer does not associate the other narratives with the mainstream outlet itself. Otherwise, in the presence of choice, the consumer will simply flee and ignore the mainstream outlet (and find it untrustworthy).

In order to ensure this outcome, mainstream outlets must provide all five Ws of each conceivable narrative, in factual depth on each W, with opinion avoided as much as possible. Unless and until normalized confirmation bias is expunged from human society, this is the best one can do. And in terms of the bottom line of mainstream outlets and their advertising revenue, this is a total coup. Almost all consumers will consume the mainstream news at least when first learning of a story, and the majority of traffic will be satisfied with that report. How do I know this? Empirical observation. The rapid growth of alternative news on YouTube demonstrates my point. The most popular alternative outlets are, surprise, factually shallow and mostly cheerleaders of a particular political ideology. This is because they are specialized toward a particular confirmation bias and must be shallow to garner wide consumption. The factual depth they provide is adequate for most consumers and comparable to what is offered in the mainstream outlets now (at least for the one narrative each presents). If the mainstream outlets spent more time on factual depth and broader narrative reporting, they could report in the same space they always have by removing the opinion and the rather transparent attempts to dance around inconvenient narratives. One of the key ways mainstream outlets expose this dancing is that most mainstream stories are obviously leaving out details (sometimes while increasing verbiage!) because the stories make little sense without them. Such a story has the motif of a meme or sound bite, and the consumer senses something is missing. This same thing was sensed by citizens of the Soviet Union for 70 years. And that’s another reason why people in the West today go to alternative outlets.

But won’t bias always exist? Of course it will, including many forms of bias beyond simple confirmation bias. The distinction made here is in how I am, for the purposes of this discussion, defining normalized versus episodic bias. A bias (or bigotry) is normalized when the recognized authorities of a culture promote that bias by action or inaction. A bias is episodic when an individual bears it.

The key change created by the Protestant Reformation was that the role of clergy was diminished. Over time this had the effect of reducing the normative confirmation bias that existed in western culture due to religion, ultimately rendering it episodic vice normalized. The same thing will happen to the normative confirmation bias created by political ideology once the role of the political class is diminished. For the activist, the best strategy here is to promote greater participation of the public in governance in the manner described here rather than getting into the weeds of the rationale that accompanies it. The reason for that is that the very same normative confirmation bias of political ideology will work against the goal. But the notion of greater public participation in governance is an idea ripe in our time and should be nurtured.

- kk

P.S. For more info on what we mean by reducing the role of the political class and how direct democracy can be made feasible, see the intro to General Federalism.

Hi all,

Today, 14 October 2014, we read of a report by WHO stating that by 1 December we could see as many as 10,000 deaths from Ebola per week. Oddly, the mainstream press finds this newsworthy, despite the fact that the very same mathematical statement was made by WHO and CDC weeks ago, when the same numbers were provided in a different form. We are told that as of today there are 4500 dead and 9000 infected. Going off the average reported error factor of about 2, this means there are more likely 9000 dead and 18,000 infected as of now. There is a disturbing problem here which demands comment in the harshest terms. Magical thinking is undermining the ability not just of the public, but of the authorities, to comprehend this disaster, on account of three salient features now evidenced incontrovertibly by statements and actions such as the one just noted:

  1. Mathematical illiteracy
  2. Inability to assess the assumptions of the math
  3. Lack of realism; or inability to reduce the general to the specific

What about log base 2 of 18000 can some people not understand? And where people do get the significance of exponential growth, what about the core assumptions is not clear? The only assumptions behind this math are pretty dry:

  1. Nature will not do something incredibly improbable in the next 12 months to favorably alter the math.
  2. There is no infrastructure on Earth that can locate, extract and “isolate” 70% of a population of 36,000 in sub-Saharan Africa scattered in three different jurisdictions over uncounted thousands of square miles of African city and jungle. (I’ll explain that number shortly). And certainly 4000 Marines can’t do it.

WHO and CDC are counting on the magical thinking of isolating 70% of the infected population, based on the idea that the reproduction rate of the virus could be rendered linear under those conditions. But implicit in this thinking is the requirement that all those persons be identified and extracted from the general population, and then isolated not by merely housing and feeding them, but by bio-containment isolation. And if we have 18,000 infected now, give or take a few thousand if it pleases you – it doesn’t much matter – then no infrastructure having a viable mathematical impact on this situation exists now and will not likely exist within the doubling time of the viral spread. Therefore, let’s not be silly and let’s assume 2*18,000=36,000. It certainly isn’t going to happen with 2000 beds provided by the U.S. military.

I’ve been trying to explain this now for over two months (in various forms) and am convinced no one comprehends a thing I’m saying.

Let me be clear, the assumptions given above are highly likely to hold and the logarithmic relation given will dominate the outcome. What part of that sentence is confusing? Do I really need to work this logarithm to demonstrate what highly likely means? Try this:

log2(7×10^9) − log2(36,000) ≈ 17.5 doublings, or about 17.5 months at one doubling every four weeks

I’m assuming that the doubling is every 4 weeks, which is generous, and I’m assuming the world’s population is at least 7 billion. Yes, the doubling rate will increase as the total number of infected increases, but I’ll ignore that for the moment.
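For those who want to check the arithmetic, here is a minimal Python sketch of that back-of-the-envelope estimate. The 36,000 starting figure, the 7 billion world population and the four-week doubling time are the assumptions stated above, not measurements:

```python
from math import log2

# Doublings needed to grow from 36,000 infected to the world population,
# under the post's assumptions: ~7 billion people and one doubling every
# four weeks (treated here as roughly one month per doubling).
current_infected = 36_000
world_population = 7e9

doublings = log2(world_population) - log2(current_infected)
months = doublings  # ~1 month per doubling

print(f"{doublings:.1f} doublings, about {months:.1f} months")
```

This is pure doubling arithmetic; it deliberately ignores the saturation and behavioral effects a real epidemiological model would include.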

We have about 18 months before there won’t be anyone left to discuss this. The mortality rate WHO just released is itself another failure to reduce the general to the specific: when people start dying in large numbers, the effective death rate will approach 90% or greater, consequent to the sheer anarchy and chaos that will result from a viral mortality of 70%, not to mention the lack of food for anyone remaining.

Let me be clear one more time:

This reduces the problem to two options. We can do one or both. We can develop a real bang-up vaccine really, really fast, and/or we can isolate populations with force. Of course, the West can relax, since its superior health care systems might prevent the assumed “seeding” numbers in the thousands like we now have in West Africa. In that case, the West can expect to be alone when the dust settles … and to be a little hungry. I hope they’re happy with that. But no, there is no need to panic in the West, because clusters will indeed likely be snuffed out. But I hope we can see why …

That doesn’t much *&^% matter.

And speaking of the West, CDC’s recommendations to hospitals are a circus of failures to reduce the general to the specific; a process otherwise known as deduction. “Meticulous” guidelines cannot be followed stochastically in a general hospital environment when those hospitals are using BSL-2. Someone in USG with a brain needs to implement BSL-4 in regional hospitals … right now.

I think nature is about to “inform” us as to just how dumb we really are and future observers will quite likely regard many actions already taken or omitted as criminal negligence of the highest order.

Over the past couple of years I’ve been trying to get the message out that religion and political ideology are vehicles of misplaced emotion that undermine IQ and are squeezing humanity into destruction in a death spiral of ignorance and superstition. I’m afraid my message won’t be heard until billions die.

- kk

The Ebola community clinic suggested by WHO and posted on the Washington Post website. The U.S. should immediately begin production of self-contained, modular “kits” of this design on a war footing, and should plan on constructing thousands. Provisions to provide armed security using U.S. military personnel at each of these clinics should be added to this design.

The following is a thought experiment with updates and it should be taken seriously, but not as an inevitability either. Nature can upset these rates of viral diffusion in any number of ways and there are too many variables to know if this estimate is valid. This is just my opinion, and the suggestions are offered as such.

If the number of infected persons should exceed 100,000, then it is unlikely that any international effort will have sufficient resources to make any mathematically non-negligible difference in the rate of diffusion of the virus. At that point, the only means of stopping the diffusion of the virus will be external quarantine.

Will we see that number? If we assume that the total number of infected is three times the estimated amount, as some close to the ground have stated, and the reported figure is 5347 as of 18 September, then the likely more accurate count is 3*5347=16041 persons as of 18 September. Therefore, 100,000 infected persons is notionally reached on the x’th doubling:

10^5 = 16041 × 2^x

Solving for x and multiplying by the observed doubling time we get:

[log2(10^5) − log2(16041)] × 3 ≈ 7.92 weeks

Defeating this trend with 3000 military personnel and several tons of equipment is improbable in that time. In fact, making any appreciable difference in this rate is improbable when we consider that it will take at least 30 days for the effort to fully stand up. If we cannot defeat this function in that time, then the moment at which the probability of containing the virus is greatest is the moment that threshold is reached; that is, in 7.92 weeks. USG should be prepared for that time, which is about 18 November.
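The 7.92-week figure can be reproduced with a short Python sketch. The 16,041 starting count and the 3-week doubling time are the assumptions given above:

```python
from math import log2

# Weeks until the infected count reaches a given threshold, assuming
# 16,041 infected as of 18 September and a constant 3-week doubling time.
def weeks_to_reach(threshold, current=16_041, doubling_weeks=3.0):
    doublings = log2(threshold) - log2(current)
    return doublings * doubling_weeks

print(f"{weeks_to_reach(100_000):.2f} weeks")  # ~7.92 weeks
```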

Here’s why.

At this time USG should (and must if it is to succeed as pointed out) shift to a quarantine solution. At this time, the total number of infected is 100,000. While this may be too many to contain the virus within the population, it is small enough to contain it within the borders of the three affected countries, assuming steps are taken to do so. But simply putting troops on the border may not be enough. A no-fly zone will be required and all arteries of passage outside these countries must be involuntarily (but temporarily) evacuated with a radius of not less than 15 miles from the point where the artery intersects the border (the actual distance would vary depending on the size of the passage and the topology). Refugees should be sent to either Liberia, Sierra Leone or Guinea. And authorities will have to “guard” this area and allow no entry. This is the only solution in the absence of a vaccine. USG should begin now to collaborate with all neighboring countries to give U.S. military personnel access to these “choke points”. USN must enforce a no-sail zone off the coasts of Liberia and Sierra Leone on 18 November.

If these numbers obtain by 18 November, USG should initiate stop-loss on all services and respond with a strategic military effort, which may involve the deployment of many tens of thousands of troops. If U.S. military force numbers exceed 3,000, all personnel should be issued chemical, biological and radiological gear (and I do mean ALL three types). Other nations that supply military personnel should do likewise. Prioritization of vaccination, wherever available, should be ranked thusly:

1. Military personnel and health professionals in the hot zone

2. Gibraltar, Sinai and Panama (my concern is that governments will continue to underestimate the scale of human migratory flow, and how this will cause a wave of successive virus infection flow along migration paths, which will converge on these “ambush sites”; in the sense that these sites, if treated wisely, could serve as points to “ambush” the virus).

3. Populations at the periphery of the hot zone

4. The hot zone

5. The global population, en masse.

If the number of infected increases the probability of containment will fall. When the number of infected in the affected countries reaches 1 million, containment will likely fail in any case. 1 million infected will be reached on:

[log2(10^6) − log2(16041)] × 3 ≈ 4.5 months, or about January 15.

At that time, should that occur, USG should apply the same tactic at Gibraltar and the Sinai and the no-fly and no-sail zone should be extended to the continent of Africa. If a vaccine is available, a mass and heavy vaccination of the population in the areas of Gibraltar and Sinai should be performed in a buffer area extending about 100 miles on either side of this boundary. The vaccine should be made available to any person seeking it, regardless of origin and indeed, it should be mandatory (if a refugee arrives in a heavily vaccinated area they are less likely to attempt to continue their exodus). If a vaccine is not available in that quantity, the same area should be evacuated. In either case, the area will require U.S. troops to deny passage to all persons. They should also deny entry into the vaccinated zone. Once this is accomplished, and all available assets should be employed to do so, similar checkpoints should be established at successive national borders from the outbreak. Obviously, if sufficient quantities of a vaccine remain, they should be administered at those locations and to the population generally.

USG should begin collaboration with Spain, Morocco, Israel and Egypt now. As a precautionary measure, USG should do the same with Panama and begin plans to set up a similar boundary in Panama. Plans to establish a buffer in Panama from Atlantic to Pacific by evacuation should begin now. If the efforts to contain the outbreak on the continent fail the virus will have run its course by:

[log2(10^9) − log2(16041)] × 3 ≈ 12 months, or about September 15, 2015.

Resulting in the loss of very roughly 500 million human lives.

If containment to the African continent fails, the virus will run its course and preferentially impact countries that are not well developed or which have large, poor populations. In that case, the virus will run its course in roughly:

[log2(7×10^9) − log2(16041)] × 3 ≈ 14 months, or about November 15, 2015.

Resulting in the loss of very roughly 2.5 billion human lives.
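All of the milestones above come from the same relation. The sketch below tabulates them under the same assumptions (16,041 infected as of 18 September, a constant 3-week doubling time, and no saturation effects):

```python
from math import log2

# Weeks from 18 September until a given infected count is reached,
# assuming 16,041 infected then and a constant 3-week doubling time.
def weeks_to_reach(threshold, current=16_041, doubling_weeks=3.0):
    return (log2(threshold) - log2(current)) * doubling_weeks

for label, n in [("100,000", 1e5), ("1 million", 1e6),
                 ("1 billion", 1e9), ("~7 billion (world)", 7e9)]:
    weeks = weeks_to_reach(n)
    print(f"{label}: {weeks:.1f} weeks (~{weeks / 4.345:.1f} months)")
```

The raw outputs (about 17.9 weeks to 1 million, 48 weeks to 1 billion, 56 weeks to 7 billion) sit slightly below the rounded month figures quoted above.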

If the (by then) pandemic escapes the African continent, China and especially India are at grave risk because of their large, poor populations confined to a single legal jurisdiction. China will likely not allow foreign assistance on a large scale, though India might. The problem that is being overlooked here is that once the virus has a large pool of infected the ability to contain it drops sharply. In reality, I expect the 2.5 billion figure to be higher. We can expect geopolitical destabilization to occur in this case, with war and conflict becoming a salient feature. The worst case scenario, though not a likely one, is for some authorities such as those in China to use a “snake and nape” tactic. I would caution any such nation (think Pakistan and China) that use of nuclear weapons will only exacerbate the problem as the radiological effect will be as bad or worse than the virus itself. The problem here is trying to guess what a developing nation will do when desperate.

There is no way to know how this function will behave in the future and its doubling rate may change, resulting in large differences in these estimates. However, we cannot ignore the fact that the observed doubling rate is the best information we have for projecting the diffusion of the virus. Another reason to be somewhat skeptical of our starting numbers is the fact that we don’t know if the doubling time is an artifact of the reporting fidelity or a true representation of the rate of diffusion of the virus. However, it is probably imprudent to assume the former.

USG and NIH should place GSK on a war footing now, by means of eminent domain or national security authority if necessary. This should be extended to any other private competency as well, if identified. The likelihood of this virus taking hold in any industrialized, wealthy nation is very low, but these events could have cataclysmic economic consequences for the entire world nonetheless.

This entire analysis assumes that no mutation rendering transmission airborne occurs. All suggestions provided are made on the premise of minimizing loss of human life.

- kk

The following is a thought experiment and it should be taken seriously, but not as an inevitability either. Nature can upset these rates of viral diffusion in any number of ways and there are too many variables to know if this estimate is valid.

If the number of infected persons should exceed 100,000, then it is unlikely that any international effort will have sufficient resources to make any mathematically non-negligible difference in the rate of diffusion of the virus. At that time, the only means of stopping the diffusion of the virus will be external quarantine.

Will we see that number? If we assume that the total number of infected is three times the estimated amount, as some close to the ground have stated, and the reported figure is 5347 as of 18 September, then the likely more accurate count is 3*5347 = 16041 persons as of 18 September. Therefore, 100,000 infected persons is notionally reached after x further doublings:

10^5 = 16041 * 2^x

Solving for x and multiplying by the observed doubling time we get:

[log2(10^5) - log2(16041)] * 3 ≈ 7.92 weeks

Defeating this trend with 3,000 military personnel and several tons of equipment is improbable in that time. In fact, making any appreciable difference in this rate is improbable when we consider that it will take at least 30 days for this effort to fully stand up. If we cannot defeat this function in that time, then the probability of containing the virus is at its maximum at the moment that threshold is reached; that is, in 7.92 weeks. USG should be prepared for that time, which is about 18 November.
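The doubling-time arithmetic above can be sketched in a few lines of Python. The function name and constants are illustrative; the 3-week doubling time and the corrected 16,041 baseline are the memo's own assumptions:

```python
import math

# Assumptions taken from the text: an observed doubling time of about
# 3 weeks and a corrected baseline of 16,041 infected on 18 September.
DOUBLING_TIME_WEEKS = 3
BASELINE_INFECTED = 16_041

def weeks_to_reach(target: float) -> float:
    """Weeks of uninterrupted exponential growth until the cumulative
    infected count reaches `target`."""
    doublings = math.log2(target) - math.log2(BASELINE_INFECTED)
    return doublings * DOUBLING_TIME_WEEKS

print(round(weeks_to_reach(100_000), 2))  # 7.92
```

This reproduces the 7.92-week figure above; changing either constant shifts every projection that follows, which is the point of the sensitivity caveats later in the memo.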

Here’s why.

At this time USG should (and must, if it is to succeed, as pointed out) shift to a quarantine solution. By then, the total number of infected will be about 100,000. While this may be too many to contain the virus within the population, it is small enough to contain it within the borders of the three affected countries, assuming steps are taken to do so. But simply putting troops on the border may not be enough. A no-fly zone will be required, and all arteries of passage outside these countries must be involuntarily (but temporarily) evacuated within a radius of not less than 15 miles from the point where the artery intersects the border (the actual distance would vary depending on the size of the passage and the topography). Refugees should be sent to either Liberia, Sierra Leone or Guinea. And authorities will have to “guard” this area and allow no entry. This is the only solution in the absence of a vaccine. USG should begin now to collaborate with all neighboring countries to give U.S. military personnel access to these “choke points”. USN must enforce a no-sail zone off the coasts of Liberia and Sierra Leone on 18 November.

U.S. military forces, if their numbers exceed 3,000, should all be issued chemical, biological and radiological gear (and I do mean ALL three types). Other nations that supply military personnel should do likewise. Prioritization of vaccination, wherever available, should be ranked thusly:

1. Military personnel and health professionals in the hot zone

2. Gibraltar, Sinai and Panama (my concern is that governments will continue to underestimate the scale of human migratory flow, and how this will cause a wave of successive virus infection flow along migration paths, which will converge on these “ambush sites”; in the sense that these sites, if treated wisely, could serve as points to “ambush” the virus).

3. Populations at the periphery of the hot zone

4. The hot zone

5. The global population, en masse.

If the number of infected increases, the probability of containment will fall. When the number of infected in the affected countries reaches 1 million, containment will likely fail in any case. One million infected will be reached in:

[log2(10^6) - log2(16041)] * 3 ≈ 17.9 weeks, about four months, or roughly mid-January.

At that time, should that occur, USG should apply the same tactic at Gibraltar and the Sinai, and the no-fly and no-sail zone should be extended to the continent of Africa. If a vaccine is available, a heavy mass vaccination of the population in the areas of Gibraltar and the Sinai should be performed in a buffer area extending about 100 miles on either side of this boundary. The vaccine should be made available to any person seeking it, regardless of origin; indeed, it should be mandatory (if a refugee arrives in a heavily vaccinated area they are less likely to attempt to continue their exodus). If a vaccine is not available in that quantity, the same area should be evacuated. In either case, the area will require U.S. troops to deny passage to all persons. They should also deny entry into the vaccinated zone. Once this is accomplished, and all available assets should be employed to do so, similar checkpoints should be established at successive national borders outward from the outbreak. Obviously, if sufficient quantities of a vaccine remain, they should be administered at those locations and to the population generally.

USG should begin collaboration with Spain, Morocco, Israel and Egypt now. As a precautionary measure, USG should do the same with Panama and begin plans to set up a similar boundary in Panama. Plans to establish a buffer in Panama from Atlantic to Pacific by evacuation should begin now. If the efforts to contain the outbreak on the continent fail, the virus will have run its course by:

[log2(10^9) - log2(16041)] * 3 ≈ 48 weeks, or about 11 months: roughly mid-August 2015.

Resulting in the loss of very roughly 500 million human lives.

If containment to the African continent fails, the virus will run its course and preferentially impact countries that are not well developed or which have large, poor populations. In that case, the virus will run its course in roughly:

[log2(7*10^9) - log2(16041)] * 3 ≈ 56 weeks, or about 13 months: roughly mid-October 2015.

Resulting in the loss of very roughly 2.5 billion human lives.
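Under the same assumptions (3-week doubling, 16,041 baseline on 18 September 2014), the milestone dates used above can be projected mechanically. This is a hypothetical sketch of the memo's arithmetic, not a forecast; the memo rounds its own figures somewhat:

```python
import math
from datetime import date, timedelta

BASELINE = 16_041          # corrected infected count, per the memo
DOUBLING_WEEKS = 3         # observed doubling time, per the memo
START = date(2014, 9, 18)  # date of the baseline estimate

def projected_date(target: float) -> date:
    """Date on which `target` cumulative infections are notionally
    reached under uninterrupted exponential growth."""
    weeks = (math.log2(target) - math.log2(BASELINE)) * DOUBLING_WEEKS
    return START + timedelta(weeks=weeks)

for label, target in [("100,000", 1e5), ("1 million", 1e6),
                      ("1 billion", 1e9), ("7 billion", 7e9)]:
    print(label, projected_date(target))
```

This yields roughly mid-November 2014, late January 2015, mid-August 2015 and mid-October 2015 respectively; the entire exercise stands or falls with the assumed doubling time, as the next paragraph concedes.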

If the (by then) pandemic escapes the African continent, China and especially India are at grave risk because of their large, poor populations confined to a single legal jurisdiction. China will likely not allow foreign assistance on a large scale, though India might. The problem being overlooked here is that once the virus has a large pool of infected, the ability to contain it drops sharply. In reality, I expect the 2.5 billion figure to be higher. We can expect geopolitical destabilization to occur in this case, with war and conflict becoming a salient feature. The worst-case scenario, though not a likely one, is for some authorities such as those in China to use a “snake and nape” tactic. I would caution any such nation (think Pakistan and China) that use of nuclear weapons will only exacerbate the problem, as the radiological effect will be as bad or worse than the virus itself. The problem here is trying to guess what a developing nation will do when desperate.

There is no way to know how this function will behave in the future, and its doubling rate may change, resulting in large differences in these estimates. However, we cannot ignore the fact that the observed doubling rate is the best information we have for projecting the diffusion of the virus. Another reason to be somewhat skeptical of our starting numbers is that we don’t know whether the doubling time is an artifact of the reporting fidelity or a true representation of the rate of diffusion of the virus. Still, it is probably imprudent to assume the former.

USG and NIH should place GSK on a war footing now, by means of eminent domain or national security authority if necessary. This should be extended to any other private competency as well, if identified. The likelihood of this virus taking hold in any industrialized, wealthy nation is very low, but these events could have cataclysmic economic consequences for the entire world nonetheless.

This entire analysis assumes that no mutation rendering transmission airborne occurs. All suggestions provided are made on the premise of minimizing loss of human life.

- kk

Hi all,

I’ve added more detail to the framework, attempting to outline it, with the provisions necessary for its implementation, in a slightly more tangible form. The ideas derive from what is known as “general federalism”, but those tenets are limited as much as possible at this high level.

Perhaps the most important first step in a top-down re-evaluation of global governance should begin by identifying the sine qua non of the global or regional environment necessary for a successful and durable global rule of law to exist. And most of that will hinge on normative beliefs, customs and practices within a given society. This is almost, but not identically, akin to stating that a cultural environment conducive to global rule of law must exist first. And, as it stands, I will argue, this is in fact the key impediment to effective multi-lateralism. Humanity must grow up, change and dispense with beliefs and behaviors that, while they may have an antecedent in our biological and social past, can no longer enjoy scientific support for their continued usefulness.

One of humanity’s greatest foibles is our tendency to inject emotion into intellectual inquiry, and the tendency this has to marginalize and exclude reason. Many today blame this on religion or some other bogeyman. Certainly, religion provides a feeding ground for uncontrolled emotion. But the truth is that the more fundamental and universal cause is misplaced emotion itself. All of the points outlined below deal directly with this issue and provide a way for humanity to address serious, global issues rationally. What follows is an executive summary of what this author has been working on for several years now; a full treatment and justification can be found in later works to be shared.

The most fundamental changes needed can be summarized below:

Matter and Energy; an evolutionary step in our understanding of economic theory such that we delineate the most fundamental factors affecting economies. The foundation of an economy lies in how we manage matter and energy. Economic “theory” merely rides on top of this fundamental, limiting fact. For any beings of technological agency, consumption of matter and energy is likely the gravest long-term threat to survival. Yes, this is a universal claim. Today we call this sustainability, but sustainability at so fundamental a level finds a nexus in economic prosperity as well. They are the same thing; most people just don’t realize it yet. The prevailing myth of our time has us believe that it is a one-way street: prosperity depends on sustainability. The truth is that each depends on the other. So, wherever there is technological agency, consumption of matter and energy will increase with time. Therefore, long-term planning should focus on increasing access to matter and energy. Currently, this access is sharply limited because we do not have the means to create an actuarially sound space transportation infrastructure. This author’s primary area of interest and effort lies in work that will grossly antiquate all existing space flight technology and make this an economic reality. We will see more of this in the next 2 to 4 years as this author matures his current work, now being done outside the public radar. It will be known later in the form of a non-profit named the Organization for Space Enterprise. The reason why space is the focus is lengthy, but our current approach of trying to hold fast to an existing form of matter (such as petroleum) or to transition to a new form of matter (periodic elements used in solar panels, for example) is not scalable.
It will ultimately destroy humanity (by a gradual starvation of matter and energy), and the only long-term solution is to source matter and energy in quantities vastly larger than what is available on Earth alone. Because of the time frames involved, this effort must begin now. This will require nimble, systemic change in the underpinnings of the free market. A clever solution is an optimization that “does no damage” to the existing system but affords more directed use of matter and energy, and this author has a proposal. Whatever this author does, USG would be well-advised to invest heavily in the development of the means and methods (not all of which involve new technologies) required to render space flight economically viable and actuarially sound.

  1. Systemic change, at the level of fundamental law, must be constructed to provide both representation and participation in decisions regarding how matter and energy, at its initial source, will be tasked within a free market.
  2. This change cannot undermine the principles of free market economics because it must “do no harm” to systems of demonstrated past performance. Therefore, the scope of this input should be limited to the incentives the public en masse is willing to provide to the private sector to encourage the survey, extraction and refinement of matter and energy on Earth and elsewhere. And such incentive should be constrained by fundamental law only to matter and energy at its source (survey, extraction and refinement; SER) with any additional powers explicitly denied. This I’ve denominated the “Public Trust” which establishes all matter and energy as public property legally owned by an irrevocable trust. This element is advised but not essential. The key concern is that no government entity should be legally entitled to ownership of matter and energy used by the private sector. The public owns it collectively by legal Trust, but the private sector is legally entitled to use it. Ownership does not transfer from private to public for existing matter and energy, but new finds are absorbed into public ownership with legal protections for private entities that seek to utilize and market it.
  3. Considerations of sustainability in this scheme should be addressed separately in Statute by direct representation and participation. The fundamental factors of merit should be codified as a balance of immediate prosperity and long-term impact (on nature and its impact on future accessibility to matter and energy).
  4. The Courts of a general federation should operate such that a party’s inference in a Court of the Federation shall carry no weight unless the evidence submitted in its support bears substantial probative force, by the manner of procedures consistent with the scientific method.

Social Justice; the evolutionary step in our normative understanding of social justice. We need to transform the public understanding of social justice to inhere the norm that social justice should be blind to personality and demographic, and should rather focus on behaviors of merit and those that lack merit. The old saying that violence begets violence likewise extends to the notion that emotional extremism begets emotional extremism. Almost all notions of social justice today rely on emotional domination of the issues and feed off of ancient and barbaric fears that do nothing but generate a vicious cycle of repeated but varying “causes” through history. The result is that throughout history we see a pattern of social justice never materializing generally throughout society, with one cause giving rise to another in a multi-regional and multi-temporal cycle that has been going on for at least 1,000 years. This is difficult to see in our immediate present because these patterns take considerable time to cycle and may occur in disparate geographies. At the base of this cycle we see the exclusion of reason from discourse on account of the emotion so naturally abundant in matters of social justice. While emotion has a legitimate place and time, if humanity is to prosper, we must learn how to separate emotion from issues requiring reason to solve. Due to vested interests in the current norm of emotionally driven understandings of social justice, this is a grave threat to the future of humanity. This will require nimble, systemic change, advanced mostly through cultural efforts.

  1. It should be established as a matter of fundamental law that any and all sumptuary law that cannot sustain scientific scrutiny shall not be law or equity within the jurisdiction of the Federation.
  2. It should be established that any Statute or equity in the Federation which shall be reasonably expected to influence a matter of social justice, however broad, shall be applied to all human behavior uniformly and predictably, to all persons, without regard to personality or demographic, else it shall not stand as law or equity in the Federation. This provision would extend to enforcement as well. Ironically, this issue is solved by simply restating a key premise of rule of law itself: uniformity and predictability.

The Political Class and public distrust: Lack of participation, and therefore of some semblance of control, whether a good thing or not, evokes fear. Fear undermines trust. The solution is to reduce the scope and scale of the political class such that representation and participation of the public are dramatically enhanced. Direct democracies simply do not work; therefore, a wholly new understanding of how to merge a political class with a more direct form of participation is urgently needed. This author has a proposal. The future of neo-liberal idealism is the evolution beyond representation alone and into the area of direct participation. A clever means of rendering that participation competent via a political class is key to this solution, involving an Assembly (analogous to a jury) and a Federation Civil Corps of citizens. As organic power decentralizes via technological agency, the duty to participate will quickly transform from nuisance to demand. The key is not to view this as an elimination of the political class, but as a “force multiplier” of the same, permitting the political class to take on a more focused role centering on providing the competence to govern. Additional mechanisms within the participatory role of the public are needed to dilute incompetence and enhance representation. This will require nimble, systemic change.

  1. The analogy given here to western law and courts is somewhat sloppy. In the case of an Assembly, their role is the consideration of statute, not equity. Equity should belong solely to the courts of the Federation.
  2. Competence is provided by a panel of elected officials (a political class) analogous to a panel of judges with the privilege of writing statute, making first motion to vote and other provisions too lengthy to get into here.
  3. Statute is “algorithm friendly” allowing votes of very large numbers of persons randomly called to duty by a double-blind process to occur in seconds.
  4. Negotiation, resolution and consultation for making statute is performed by a Federation Civil Corps, consisting of lawyers, economists and other experts. It shall be a strictly merit-based system. Their duty is to inform and educate the Assembly and provide communication and consultation capacity between the elected officials and the Assembly.
  5. Assemblies are called every 12 months, consisting of a unique selection of Citizens of the Federation at large. It could be either voluntary or a lawful duty (I suggest that it be a lawful duty).
  6. Numbers of Assembly members are sufficient to allow representation of one-tenth of all Citizens of the Federation once every 100 years.
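One way to read item 6's sizing rule: with a fresh Assembly seated every 12 months, one hundred Assemblies convene per century, so seating one-tenth of all Citizens over that span fixes each Assembly at one-thousandth of the population. A hypothetical sketch (the function name and the 320-million example population are illustrative, not from the text):

```python
# Hypothetical sizing of an annual Assembly under the rule above:
# one-tenth of all Citizens serve once across 100 yearly Assemblies,
# so each Assembly seats population * 0.1 / 100 members.
def assembly_size(population: int, fraction: float = 0.1,
                  assemblies: int = 100) -> int:
    """Members per annual Assembly so that `fraction` of the population
    serves once over `assemblies` successive Assemblies."""
    return round(population * fraction / assemblies)

print(assembly_size(320_000_000))  # 320000 members per year
```

At that scale the Assembly is far too large to deliberate conventionally, which is presumably why item 3 insists that statute be “algorithm friendly” and votable in seconds.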

Organic power structure: Organic power structures in any society of technological agency will tend to decentralize over time, and organic power in the future will more likely exist as a force of change coming from the populace en masse. The very meaning of “organic power structure” is shifting beneath our feet, and victory will go to those who see it. It is important to warn future generations that this is a consequence of technological change itself and not an objective or goal. We must prepare for this, and it is a key reason for the need to re-frame our normative understanding of social justice (but it must be done for all matters of social justice in order to ensure that a durable norm is the product). Class differences cannot be resolved if justice for one means injustice for another, regardless of the relative differences. This author has a solution that will ensure justice for all, which includes a mechanism that does not rely on schemes of income redistribution or the denial of social reward already accrued through lawful and legitimate means. This transition will occur over many generations no matter what is done, but this author’s solution provides a structured way to accomplish it without injustice and general social collapse, and under a durable framework of governance. The key finding here is that organic power is evolving into something never seen before: organic power has throughout all of human history derived of relatively large differences in wealth, but is now, for the first time, evidencing a pattern of change toward a balance between power derived of wealth and power derived of technological agency. To remain durable, a responsible government must take these forces of influence into account. This will require nimble, systemic change:

  1. This historical evolution is accommodated in full by the process outlined regarding participatory governance.
  2. It should be a matter of substantive fundamental law that no person may be dispossessed of property without due process of law, which fundamental law should inform as not ponderable by any court of the Federation unless eminent domain for the greater good is well established and fair market value compensation is afforded.
  3. It should be a matter of substantive fundamental law that the right to private property shall not be infringed.
  4. It should be a matter of substantive fundamental law that the right to seek redress for violations of substantive fundamental law shall not be infringed; however, lobbying of the Federation by any entity for any other reason shall be unlawful. This is a key provision of durability and an accounting for a new kind of organic power and should not be overlooked.

Implementation: A General Federation must be extremely flexible over time, such that it can begin as a non-profit and then promote to accession by a nation-state. Then, over time, it must include other nation-states, at a pace limited to the inclusion of states only where the norms cited herein are sufficiently strong to support it. An alliance of states that do not possess these norms will not be durable or effective, and this is the primary reason why multilateralism has failed. Currently, the only candidates that exist are the United States, Israel, Germany and the UK, and those states will require much preparatory work in nurturing a healthy set of norms as listed here before that can happen. Currently, the United States is number one on the runway, despite its relatively poor advancement in matters of social justice. Additional mechanisms have been devised to also allow scaled accession of developing nations. But it should not be forgotten that while normative practices are necessary, codification in explicit rule of law must come alongside them. Schemes that deny the central necessity of codified, transparent rule of law gathered by consensus will fail. This is the second cause of the failure of multilateralism. Disaggregated states and other schemes that pretend to operate “from the back door” are not durable in time. We don’t need more politicians or technocrats; they are not a solution to the problem, and in the near future they are likely to be the problem. That is because, wherever the scope of the political class expands, the fear increases. In a future world of ever-advancing technological agency, failure to better balance competence with participation will be disastrous. The public must be enlisted to fulfill this balance and give agency a voice. To be clear, this identifies a much larger, longer-term threat, one that encompasses (but includes much more than) what we today call terrorism, the canonical and most extreme example of this failure mode.

  1. This can be achieved in fundamental law by the inclusion of a provision for a “National Codicil”, too lengthy to describe here.
  2. A National Codicil reduces burdens on member states to allow sunshine provisions for the ramping up of certain Federation sovereignties over a renewable period of fifty years.
  3. It should begin with the United States as its sole member, such that the normative values of that institution may be inhered sufficiently before it accedes to a cosmopolitan or pluralistic stage. It does not require any change to U.S. relations with other organizations such as the UN or NATO. That the U.S. be the first state is crucial. The U.S. could begin this process solely pro forma, with essentially no commitment to change U.S. policy at the outset, but its long-term effect on inhering these values worldwide would be enormous. It would be the first, tangible example of the idealistic values of global-ready, neo-liberal western democracy and would quickly have countries begging to join. This would put USG in the driver’s seat as far as ensuring those values are present beforehand. It would also give USG a chance to introduce this to the U.S. public and give them, and supporters of the cause, time to digest it and increase support for it. It would also give USG an opportunity to experiment with and tweak the Assembly concept. The answer to global governance is simple: we just need to DO it.

Who can implement this: Such an effort can only be achieved if spearheaded by the leadership of a nation most “advanced” in these norms and whose relative economic power can sustain it. The United States still isn’t there, but it is humanity’s best hope. It’s time to get novel and advance the state of affairs in the management of human society. The clock is ticking. Listen to me now, hear me later.

  1. The system propounded is a Hamiltonian, federal system; that is, wherever statute is enacted for the one State, then for all uniformly. It is a system of subsidiarity. It is a system with a strong executive and which regards economics as within the compass of the social contract. It is a system consisting of four distinct branches; legal, economic, executive and judicial. It is a system contrived to balance the powers of those branches, and to balance the interests of the greater good and the individual. It is a system whereby equity is solely applied to inform the rule of law by the color of the instance, not violate it. The executive and judicial powers are filled by a political class. The legal and economic powers are filled by a political class and their respective Assemblies. Supreme Justices are appointed by the political class for life.

The future will involve many quick pivots as asynchronous events seem to drive us in all directions at once. Multilateralism demands the kind of decisive action only a durable force can provide. A strong federal executive and its lot, constrained by the idealistic, normative values that tame it, is where it’s at. This has been evidenced most recently in the crisis with ISIL and Ebola. One week it was ISIL, the next week it was Ebola. No one invented that. It’s our future, get used to it.

A final, related note is the question of where Russia, PRC, DPRK et al. are when Ebola threatens millions of human lives. Yes, they are offering assistance, but no one has acted as assertively as the United States. This is a time-tested pattern. From Fort Sumter to Leyte Gulf, from Ia Drang to Somalia, America has repeatedly shed blood for a greater good. Now, 3,000 servicemembers are about to risk their lives once again and thousands of tons of hardware move into harm’s way. It tells us that idealistic normative values coupled with clever fundamental law are the forces of idealism and assertiveness humanity needs. The lack of response elsewhere is not because of a lack of ability. Russia has a fine navy. PRC has a massive army. Criticize the United States if you wish (and there is a time and place for that), but it is a cheap shot that merely jeopardizes humanity’s future. It’s time to get real.

- kk

Hi all,

Perhaps the most important first step in a top-down re-evaluation of global governance should begin by identifying the sine qua non of the global or regional environment necessary for a successful and durable global rule of law to exist. And most of that will hinge on normative beliefs, customs and practices within a given society. This is almost, but not identically, akin to stating that a cultural environment conducive to global rule of law must exist first. And, as it stands, I will argue, this is in fact the key impediment to effective multi-lateralism. Humanity must grow up, change and dispense with beliefs and behaviors that, while they may have an antecedent in our biological and social past, can no longer enjoy scientific support for their continued usefulness.

One of humanity’s greatest foibles is our tendency to inject emotion into intellectual inquiry, and the tendency this has to marginalize and exclude reason. Many today blame this on religion or some other bogeyman. Certainly, religion provides a feeding ground for uncontrolled emotion. But the truth is that the more fundamental and universal cause is misplaced emotion itself. All of the points outlined below deal directly with this issue and provide a way for humanity to address serious, global issues rationally. What follows is an executive summary of what this author has been working on for several years now; a full treatment and justification can be found in later works to be shared.

The most fundamental changes needed can be summarized below:

  1. Matter and Energy; an evolutionary step in our understanding of economic theory such that we delineate the most fundamental factors affecting economies. The foundation of an economy lies in how we manage matter and energy. Economic “theory” merely rides on top of this fundamental, limiting fact. For any beings of technological agency, consumption of matter and energy is likely the gravest long-term threat to survival. Yes, this is a universal claim. Today we call this sustainability, but sustainability at so fundamental a level finds a nexus in economic prosperity as well. They are the same thing; most people just don’t realize it yet. The prevailing myth of our time has us believe that it is a one-way street: prosperity depends on sustainability. The truth is that each depends on the other. So, wherever there is technological agency, consumption of matter and energy will increase with time. Therefore, long-term planning should focus on increasing access to matter and energy. Currently, this access is sharply limited because we do not have the means to create an actuarially sound space transportation infrastructure. This author’s primary area of interest and effort lies in work that will grossly antiquate all existing space flight technology and make this an economic reality. We will see more of this in the next 2 to 4 years as this author matures his current work, now being done outside the public radar. It will be known later in the form of a non-profit named the Organization for Space Enterprise. The reason why space is the focus is lengthy, but our current approach of trying to hold fast to an existing form of matter (such as petroleum) or to transition to a new form of matter (periodic elements used in solar panels, for example) is not scalable.
It will ultimately destroy humanity (by a gradual starvation of matter and energy), and the only long-term solution is to source matter and energy in quantities vastly larger than what is available on Earth alone. Because of the time frames involved, this effort must begin now. This will require nimble, systemic change in the underpinnings of the free market. A clever solution is an optimization that “does no damage” to the existing system but affords more directed use of matter and energy, and this author has a proposal. Whatever this author does, USG would be well-advised to invest heavily in the development of the means and methods (not all of which involve new technologies) required to render space flight economically viable and actuarially sound.
  2. Social Justice; the evolutionary step in our normative understanding of social justice. We need to transform the public understanding of social justice so that it embodies the norm that social justice should be blind to personality and demographic, focusing instead on behaviors of merit and those that lack merit. The old saying that violence begets violence extends likewise to the notion that emotional extremism begets emotional extremism. Almost all notions of social justice today rely on emotional domination of the issues and feed off ancient and barbaric fears that do nothing but generate a vicious cycle of repeated but varying “causes” through history. The result is that throughout history we see a pattern of social justice never materializing generally throughout society, with one cause giving rise to another in a multi-regional and multi-temporal cycle that has been going on for at least 1,000 years. This is difficult to see in our immediate present because these patterns take considerable time to cycle and may occur in disparate geographies. At the base of this cycle is the exclusion of reason from discourse, owing to the emotion so naturally abundant in matters of social justice. While emotion has a legitimate place and time, if humanity is to prosper, we must learn to separate emotion from issues that require reason to solve. Given vested interests in the current norm of emotionally driven understandings of social justice, this is a grave threat to the future of humanity. This will require nimble, systemic change advanced mostly through cultural efforts.
  3. The Political Class and public distrust: Lack of participation, and therefore of some semblance of control, whether a good thing or not, evokes fear. Fear undermines trust. The solution is to reduce the scope and scale of the political class so that representation and participation of the public are dramatically enhanced. Direct democracies simply do not work; therefore, a totally novel understanding of how to merge a political class with a more direct form of participation is urgently needed. This author has a proposal. The future of neo-liberal idealism is the evolution beyond representation alone and into the area of direct participation. A clever means of rendering that participation competent via a political class is key to this solution, involving an Assembly (analogous to a jury) and a Federation Civil Corps of citizens. As organic power decentralizes via technological agency, the duty to participate will quickly transform from nuisance to demand. The key is not to view this as an elimination of the political class, but as a “force multiplier” of the same, permitting the political class to take on a more focused role centered on providing the competence to govern. Additional mechanisms within the participatory role of the public are needed to dilute incompetence and enhance representation. This will require nimble, systemic change.
  4. Organic power structure: Organic power structures in any society of technological agency will tend to decentralize over time, and organic power in the future will more likely exist as a force of change coming from the populace en masse. The very meaning of “organic power structure” is shifting beneath our feet, and victory will go to those who see it. It is important to warn future generations that this is a consequence of technological change itself and not an objective or goal. We must prepare for this, and it is a key reason for the need to re-frame our normative understanding of social justice (but it must be done for all matters of social justice in order to ensure that a durable norm is the product). Class differences cannot be resolved if justice for one means injustice for another, regardless of the relative differences. This author has a solution that will ensure justice for all, which includes a mechanism that does not rely on schemes of income redistribution or the denial of social reward already accrued through lawful and legitimate means. This transition will occur over many generations no matter what is done, but this author’s solution provides a structured way to do it without injustice and general social collapse, and under a durable framework of governance. The key finding here is that organic power is evolving into something never seen before: throughout all of human history, organic power has derived from relatively large differences in wealth, but now, for the first time, it is evidencing a pattern of change toward a balance between power derived from wealth and power derived from technological agency. To remain durable, a responsible government must take these forces of influence into account. This will require nimble, systemic change.
  5. Implementation: A General Federation must be extremely flexible over time, such that it can begin as a non-profit and then proceed to accession by a nation-state. Over time it must then include other nation-states, at a pace limited to the inclusion only of states where the norms cited herein are sufficiently strong to support it. An alliance of states that do not possess these norms will not be durable or effective, and this is the primary reason why multilateralism has failed. Currently, the only candidates that exist are the United States, Israel, Germany and the UK, and those states will require much preparatory work in nurturing a healthy set of norms as listed here before that can happen. Currently, the United States is number one on the runway, despite its relatively poor advancement in matters of social justice. Additional mechanisms have been devised to allow scaled accession of developing nations as well. But it should not be forgotten that while normative practices are necessary, codification in explicit rule of law must come alongside them. Schemes that deny the central necessity of codified, transparent rule of law gathered by consensus will fail. This is the second cause of the failure of multilateralism. Disaggregated states and other schemes that pretend to operate “from the back door” are not durable in time. We don’t need more politicians or technocrats; they are not a solution to the problem, and in the near future they are likely to be the problem. That is because wherever the scope of the political class expands, fear increases. In a future world of ever-advancing technological agency, failure to better balance competence with participation will be disastrous. The public must be enlisted to fulfill this balance and give agency a voice. To be clear, this identifies a much larger, longer-term threat that encompasses (but includes much more than) what we today call terrorism, the canonical and most extreme example of this failure mode.
  6. Who can implement this: Such an effort can only be achieved if spearheaded by the leadership of a nation most “advanced” in these norms and whose relative economic power can sustain it. The United States still isn’t there, but it is humanity’s best hope. It’s time to get novel and advance the state of affairs in the management of human society. The clock is ticking.

Listen to me now, hear me later.

The future will involve many quick pivots as asynchronous events seem to drive us in all directions at once. Multilateralism demands the kind of decisive action only a durable force can provide. A strong federal executive and its lot, constrained by the idealistic, normative values that tame it, is where it’s at. This has been evidenced most recently in the crises with ISIL and Ebola. One week it was ISIL, the next week it was Ebola. No one invented that. It’s our future; get used to it.

A final, related note is the question of where Russia, the PRC, the DPRK et al. are when Ebola threatens millions of human lives. Yes, they are offering assistance, but no one has acted as assertively as the United States. This is a time-tested pattern. From Fort Sumter to Leyte Gulf, from Ia Drang to Somalia, America has repeatedly shed blood for a greater good. Now 3,000 servicemembers are about to risk their lives once again, and thousands of tons of hardware are moving into harm’s way. It tells us that idealistic normative values coupled with clever fundamental law are the forces of idealism and assertiveness humanity needs. The lack of response elsewhere is not due to a lack of ability. Russia has a fine navy. The PRC has a massive army. Criticize the United States if you wish (and there is a time and place for that), but this is a cheap shot that merely jeopardizes humanity’s future. It’s time to get real.

- kk

“A Prank, A Cigarette and a Gun”


An article about the murder of Meredith Kercher. I’ve agreed not to say much but if you’re interested in this case this is the most important read of all. The truth of what happened is in here.

Click below:

The So-called “Best Fit Report”

by Sigrun M. Van Houten

What is Best Fit Analysis, a quick intro

Students of statistics and stochastic analysis will recognize the term “Best Fit” as a statistical construct used to determine a most probable graph on a scatter plot. In the same manner, when the clandestine services seek to resolve a most probable narrative of events and information about that narrative is limited or inaccessible, one can assess the information that is available to construct a probabilistic scatter plot. Once that is done, it is possible to graph a “line” that represents a best fit to those data points. In this case, the data points making up the scatter plot are individual facts or pieces of evidence, together with the probability each possesses as inference to a larger narrative. And the “line” drawn as a best fit is the most likely narrative of events.
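For readers unfamiliar with the underlying statistical construct, a best-fit line through a scatter plot can be computed in a few lines. This is only an illustration of the statistical analogy, not of BFA itself; the data and the line y = 2x + 1 are invented:

```python
import numpy as np

# Invented data: points scattered around the line y = 2x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.5, size=x.size)

# Least-squares fit of a straight line (degree-1 polynomial)
# through the scatter plot.
slope, intercept = np.polyfit(x, y, deg=1)

print(f"best fit: y = {slope:.2f}x + {intercept:.2f}")
```

The fitted slope and intercept land close to the true values of 2 and 1; the residual scatter is the analogue of the uncertainty attached to each piece of evidence.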

This procedure has parallels in normative notions well understood in western law for some time. Namely, it deals with probative value which, though perhaps not used here in the strict legal sense, serves as a catch-all to describe each data point. Each data point reflects the probability of a given piece of evidence. But what do we mean by “probability of a given piece of evidence”? In Best Fit Analysis (BFA) we begin by constructing a hypothesized narrative. When applied to criminology, the hypothesized narrative usually presents itself fairly easily, since it is almost always the “guilt narrative” for a given suspect or suspects in a crime. In this short introduction to BFA, I will show how it can be used in criminology. The advantage in criminology is that rather than having to sort through innumerable hypotheses, as is common in the clandestine services, we usually have a hypothesis presented to us in the form of an accusation or charge. We can then use BFA to test that narrative and see whether it is the most likely one. With perturbations of the same, we can likely identify alternative narratives more likely to be correct.

Norms of western law are dated in some cases, and some have not been updated for a very long time. One of those areas, apparently, is probative value. Typically, Courts in western nations presume that a piece of evidence has “probative value” if it points to a desired inference (which may be a guilt narrative or some more specific component thereof). I’m not an attorney, so I can’t categorically state whether the concept of a “desired inference” refers to an overall guilt narrative or simply to the odds that the evidence points to a guilt narrative. But what I can say is that in practice it is almost always used in the sense of an overarching narrative or reality.

A case in point is a famous case in which a man was accused of murdering his wife during a divorce. It turned out that his brother had actually committed the crime. But once his brother was convicted, an attempt was made to convict the husband on the accusation that he had “contracted” with the brother to commit the crime and end his divorce case favorably. In the second trial of the husband, the evidence was almost entirely circumstantial, and the jury relied heavily on an increase in phone activity between the husband and his brother leading up to the murder. Normally the brothers had not spoken on the phone often, and there was a clear and obvious sudden increase in the frequency of calls. The jury interpreted this as collusion and convicted the husband of murder. Thus, when brought to Court, the desired inference from the testimony and phone records was that collusion existed. This is a piece of evidence being used to point to a guilt narrative. The problem, however, was that it was never shown why collusion should be a more likely inference than simple distress over a divorce. It is not unusual for parties in a divorce to reach out to family and suddenly increase their level of communication at such a time. In other words, on the face of it, one inference was just as likely as the other.

What legal scholars would say is that this is a reductionist argument and fails because it does not take into account the larger “body of evidence”. Unfortunately, this is mathematically illiterate and inconsistent with the proper application of probability. This is because it takes a black and white view of “reduction” and applies it incorrectly, resulting in a circularity condition. The correct answer is that

… One takes a body of evidence and reduces it to a degree sufficient to eliminate circularity and no further.

In other words, it is not all or nothing. In fact, this kind of absolutist understanding of “reductionist argumentation” is precisely what led to the results of the Salem Witch Trials. In those cases, probative value was ascribed based on a pre-existing hypothesis or collection of assumptions; essentially a cooked recipe for enabling confirmation bias either for or against guilt.

To explain what we mean: in the case of the phone calls between brothers, one cannot use a hypothesized narrative (the inference itself) to support the desired inference. This is circularity. But one also cannot reduce the evidence to such a degree that the body of evidence in toto is not weighed upon the merit of its parts. From the perspective of probability theory, this means we must first determine, as an isolated question, whether the probability that the phone calls were for the purpose of collusion is greater than the probability that they were due to emotional distress. And it must be something we can reasonably well know and measure. While we may never be able to attach precise numerical values to these things, it must at least be an accessible concept. Once we’ve looked at the odds of each of the two possible inferences, we can then ask which is more likely. Unless the odds that the calls were for the purpose of collusion exceed the odds that the calls were for emotional support, there can be no probative value (in the sense we are using that term here).

The reason for the “isolation” is that we cannot determine the aforesaid odds by using the inference, or the larger narrative, to support those odds, because the narrative is the hypothesis itself. Having said that, once we have done this, if we can show that the odds are greater that the calls between brothers were for the purpose of collusion, even if the difference in probability between the two inferences is very small, the phone calls can then be used to assess the likelihood of the guilt narrative by considering them in the context of the whole body of evidence. If we could associate numbers with this analysis as a convenience for illustration: if we have 10 pieces of evidence each bearing only, perhaps, a 5% probability difference favoring the guilt narrative, it might nonetheless be possible to show that the guilt narrative is the most likely narrative. In other words, we consider all evidence, each piece with its own net odds, in order to frame the odds of the guilt narrative. We are therefore using reduction only to the extent that it excludes circularity, and no more. And both the number of evidentiary items and the odds of each matter. If we had 3 pieces of evidence each bearing a net probability of 90% favoring a guilt narrative, the case might be just as strong as with 10 pieces bearing a net probability of only 5%. And it is these odds that must be left to the jury, as this is not a mathematical or strictly legal exercise but an exercise in conscience and odds.
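One way to make this aggregation concrete, under a Bayesian reading that the author does not spell out, is to express each piece of evidence as a likelihood ratio and multiply the ratios together. The numbers below are purely illustrative (a ratio of 1.05 standing in for the text’s “5% probability difference”, and 0.90/0.10 for the “90%” items):

```python
from math import prod

# Each piece of evidence gets a likelihood ratio:
#   P(evidence | guilt narrative) / P(evidence | innocent alternative)
# assessed in isolation, without appealing to the narrative itself.
weak_evidence = [1.05] * 10          # ten weakly probative items
strong_evidence = [0.90 / 0.10] * 3  # three strongly probative items

def posterior_odds(prior_odds, likelihood_ratios):
    """Combine independent evidence by multiplying likelihood ratios."""
    return prior_odds * prod(likelihood_ratios)

# Starting from even (1:1) prior odds:
print(posterior_odds(1.0, weak_evidence))    # ~1.63 : 1 favoring guilt
print(posterior_odds(1.0, strong_evidence))  # ~729 : 1 favoring guilt
```

The sketch also shows the author’s point about count versus strength: many weak items compound only modestly, while a few strong items dominate, and an item with a ratio of exactly 1 (net odds of zero) contributes nothing at all.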

Sadly, it is routine practice in western Courts to employ probative value in such a manner as to establish in the jury’s thinking a circularity condition whereby the larger narrative of guilt or innocence is used to substantiate the probative value of individual pieces of evidence. The way to control this is for the understanding of probative value to change and modernize, and to require Judges to reject (rule inadmissible) evidence that either does not point in any direction (net odds of 0%) or points in a different direction than the desired inference. This is a judgment call that can only be left to the Judge, since leaving it in the hands of the jury effects prejudice by its very existence. While there seems to be lip service to treating probative value as we’ve described, it appears to be almost never followed in practice, and most laws and Court regulations permit Judges to use their “discretion” in this matter (which, in practice, amounts to accepting evidence with zero probative value). Standards are needed to constrain the degree of discretion seen in today’s Courts and to render Judges’ rulings on probative value more consistent and reliable. One way to do this is to treat evidence as it is treated under BFA.

While many groups that lobby and advocate against wrongful conviction cite all sorts of reasons for wrongful convictions, tragically they seem to be missing the larger point: these underlying, structural and systemic issues surrounding probative value are the true, fundamental cause of wrongful conviction. For without proper filtering of evidence, things like prosecutorial misconduct, bad lab work, etc. find their way to the jury. It is inevitable. But the minute you mention “structural” or “systemic” problems, everyone runs like scared chickens. No one wants to address the need for major overhauls. But no real improvement in justice will come until that happens.

Thus, with BFA, in the clandestine context, we take a large data dump of everything we have. Teams go through the evidence to eliminate whatever can be shown on its face to be false. Then we examine each piece of evidence for provenance and authenticity, again only on what can be shown on its face. I’m condensing this process considerably, but that is the essence of the first stage. We then examine each piece in relation to all advanced hypotheses and assign odds to each. Once that is done, we look at the entire body of evidence in the last stage to determine which of the narratives (hypotheses) requires the fewest assumptions to make it logically consistent. The one with the fewest assumptions is the Best Fit. If we were to graph it, we would see a line running through a scatter plot of probabilistic evidence. That line represents the most likely narrative. On that graph, assumptions appear as “naked fact” and are “dots” to be avoided.
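The selection step described above can be sketched abstractly. The narratives and assumption counts below are entirely invented for illustration; a real analysis would also carry each piece of evidence’s assessed odds alongside the assumption count:

```python
# Hypothetical final stage of BFA: each candidate narrative carries the
# number of unsupported assumptions needed to keep it logically
# consistent with the vetted evidence. Fewest assumptions wins.
narratives = {
    "narrative A": 7,
    "narrative B": 2,
    "narrative C": 4,
}

def best_fit(candidates):
    """Return the narrative requiring the fewest assumptions."""
    return min(candidates, key=candidates.get)

print(best_fit(narratives))  # narrative B
```

This is essentially a parsimony criterion: of the narratives consistent with the evidence, prefer the one that leans least on “naked fact”.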

To see a good example of how BFA is employed, you can see my work on the Lizzie Andrew Borden, Jon Benet Ramsey, Darlie Routier and Meredith Kercher cases. That this method is remarkably more effective than what we see in police investigations and Courts is well known by those who have used the technique for at least three decades now. But it has remained somewhat under the radar of the general public because of its origins. My hope is that through public awareness this method can be applied to criminology and jurisprudence, resulting in far greater accuracy in determining what actually occurred during criminal acts, especially in capital cases, where the added overhead is well worth it.

~ svh

Notice: I am told that Mozilla users can experience a copy of this report that has sections of text missing. I recommend that Mozilla users download the pdf and view it on their desktop. – kk
