Friday 29 September 2017

Distribution Devisal

Would an artificially intelligent program delete its replica to gain priority status?
Would aliens attack humans for our resources rather than offering trade?
Would Jesus shove someone out of line to get a taco quicker?
Would a wolf hide a dead moose from its starving pack, keeping it for itself?

There is a common distribution rationale linking these scenarios, and it suggests that each occurrence is unlikely. An entity capable of effective reasoning (be it the individual itself, or a trial-and-error optimisation process such as natural selection) would likely arrive at the same general conclusion, from an analysis of how gains are most efficiently distributed.

A generally applicable rule, based on deductive reasoning: all otherwise-undefined units should be considered of equivalent value when changes in possession quantity are distributed. From this it follows that every unit should receive an equal share of any gain, to keep the units equal; likewise, any reduction in possessions should be split evenly between units. This is relevant in the scenario where one unit gains or loses something: the change should be applied universally, or distributed evenly among the remaining units, in order to retain the overall equality and efficiency of the group as a whole. The same rule covers the case where a change to the total sum is intended: the alteration, whether positive or negative, should be spread evenly across all units.
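To make that concrete, here is a minimal sketch in Python. The function name and the numbers are mine, invented purely for illustration; it just splits any gain or loss equally across units assumed to be of equivalent value.

    # Split a change in total possessions evenly across equal-value units.
    # Everything here is invented for illustration.
    def distribute_evenly(possessions, delta):
        share = delta / len(possessions)          # equal share per unit
        return [p + share for p in possessions]

    units = [10.0, 10.0, 10.0, 10.0]              # four units of equal standing
    print(distribute_evenly(units, 8.0))          # gain: [12.0, 12.0, 12.0, 12.0]
    print(distribute_evenly(units, -4.0))         # loss: [9.0, 9.0, 9.0, 9.0]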

With enough statistical evidence, an entity might rationally deduce that it has greater value than some other units (based on practical capabilities and so on), but likely only to a small degree, assuming the units are broadly similar. Some degree of self-favouring may even be justified in certain scenarios, but only in relation to distinctly differing attributes, and only where the resource matters enough to warrant disproportionate distribution (which itself carries a risk of conflict).

Now, what in the frick am I talking about? I may sound like some sort of hippy calculator program, and maybe I am in some way (what's wrong with that?). This is my method of describing something in the most generic form I can envision, to reduce the biases, mindsets and preconceptions that come attached to a more specific example. So now that the unbiased description has been worked out, I'll apply it to more specific examples.

An AI program would likely recognise that a replica of itself (a unit) is, as far as it is capable of deducing, of theoretically equivalent value. If the AI has any sort of achievement target, then with appropriate analysis capabilities it would conclude that additional units equivalent to itself increase the probability of achieving that target through cooperation.

An alien race at a similar level of development to humans would likely realise that attacking us would result in a large net loss, whereas trading adds value for all units in quantity of possessions.

Jesus would empathise that the other guy, for all he can know, deserves his taco just as quickly as Jesus deserves his own (assuming the perspective of an equivalent human), relative to when each of them joined the taco line. Time, as a quantity of possession, should be considered equally distributable between units.

The wolf could keep the dead moose for itself as a long-term source of food, but as natural selection has worked out through long-term trial and error on wolf behaviour, sharing the moose with the rest of the starving pack allows them to survive as well, and a surviving pack can work together in the near future to catch more prey far more effectively.
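A toy back-of-the-envelope version of that trade-off, again in Python, with every number invented purely for illustration:

    # Invented numbers: one moose is worth 100 units of food, the pack has
    # four wolves, and a surviving pack pulls off several more shared hunts.
    moose = 100.0
    pack_size = 4
    future_hunts = 5

    hoard = moose                                  # lone wolf eats once, pack starves
    share_now = moose / pack_size                  # 25.0 each today
    share_long_term = share_now + future_hunts * (moose / pack_size)

    print(hoard)            # 100.0, once, with no pack left to hunt with
    print(share_long_term)  # 150.0 per wolf over time, pack intact

Hoarding wins once; sharing wins every hunt after that.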

When it comes to devising distribution, the most effective and efficient method seems to be even application of any change in acquisitions to all units of perceived equal value. Evenly spread distribution tends to keep the group balanced, and achieves a net positive for the total sum.
