Sunday, June 9, 2013

Don’t Let the Economy Stop You from Starting

If you feel the tug to start a business, why not just go do it?  Well, it seems, starting a company is one of those things that people, unfortunately, often talk themselves out of.  So if you are looking for an excuse to ignore the tug, let's consider this one: “The economy just isn’t strong enough right now; it’s too risky to start a company.”  We can even use last week’s jobs report to validate this excuse; the report warned that "the economy could slow in the summer months".

Well, it turns out that’s actually a terrible excuse.  To the contrary, companies founded in a weak economy have at least as good a chance of long-term survival as those founded during good times.  It is interesting to note that one study found that more than half of the 500 largest companies in the U.S. were started during a recession.
Some aspects of a weak economy can actually help a startup--for example, those in your network who are unemployed will be more likely to take a chance helping you build a company, as they may have few other options.  Office space is more likely to be available at an affordable price.  Meanwhile, potential customers may be short on cash themselves and so be more likely to take a chance on a new company.

Moreover, the same study mentioned above finds there are reasons to believe that a business started during a downturn might actually benefit in the long term.  It is as if such businesses get an inoculation against recessions.  Businesses started in the midst of a strong economy are the ones blowing millions on Super Bowl ads, while those started during lean and mean times stay lean and mean.
Speaking of “lean”, the other reason to get started despite a weak economy is the learning that has happened during the recent recession, specifically about The Lean Startup as espoused by Eric Ries.  As reported by the Harvard Business Review, the lean startup methodology changes everything.  As it relates to a weak economy, the key point is that, using this approach, a smaller amount of capital is invested and put at risk early in a startup’s life.

Here in 2013, we are far from the dark days of 2007, when things looked bleak indeed; but we are also not in the heady days of the late nineties—so the ironic good news is that the economy is still bad enough to help you get a good start.   So get going, and remember this Seth Godin quote:  “The only thing worse than starting something and failing… is not starting something”.

Monday, June 3, 2013

Useful Software Adages (laws?)

It seems to be a natural human tendency to simplify life down to a few simple rules or laws, and those of us whose lives revolve around software development naturally strive to simplify it down to simple adages and maxims. With that in mind, I found this posting, Some "Laws" of Software Development, interesting and fun to read (just don't look at the comments--a bunch of people taking all the fun out of it).

I plan to maintain a list here of those "laws" (adages, maxims) that I have personally witnessed, and for each, include a link to an article that relates or explains the law:

Many of the summaries are taken directly from Wikipedia.

Serious Laws
  • Al’s law: "people will remember slipped dates but not slipped features"
  • Amdahl's Law: "used in parallel computing to predict the theoretical maximum speedup using multiple processors" (as described in Wikipedia); also see Gustafson's Law (a more practical version of Amdahl's)
  • Ashby's Law: "if a system is to be able to deal successfully with the diversity of challenges that its environment produces, then it needs to have a repertoire of responses which is (at least) as nuanced as the problems thrown up by the environment. So a viable system is one that can handle the variability of its environment. Or, as Ashby put it, only variety can absorb variety."
  • Bell's Law of Classes of Computers - improvements in hardware/software lead to a new "class" of devices each decade.
  • Boyd's Law of Iteration -- Speed of iteration beats the quality of iteration (assuming you are doing meaningful learning during each iteration--failing productively).
  • CAP theorem, also named Brewer's theorem after computer scientist Eric Brewer states that it is impossible for a distributed computer system to simultaneously provide more than two out of three of the following guarantees: consistency, availability, and partition tolerance.
  • Law of Demeter: The fundamental notion is that a given object should assume as little as possible about the structure or properties of anything else (including its subcomponents), in accordance with the principle of "information hiding". 
  • In a distributed system, you can know where the work is done or you can know when the work is done, but you can't know both.—Pat Helland
  • Humphrey’s law: "users do not know what they want until they see working software"
  • Hyrum's Law -  "With a sufficient number of users of an API, it does not matter what you promise in the contract: all observable behaviors of your system will be depended on by somebody."
  • Campbell's Law is the observation that once a metric has been identified as a primary indicator for success, its ability to accurately measure success tends to be compromised (see also Goodhart's Law and Cobra-effect).
  • Conway’s Law:  "a software system will reflect the organizational structure of the team developing the software"
  • Glass's Law: "For every 25 percent increase in functionality that vendors add to their devices, there is a 4X multiplying effect in terms of complexity of that system."
  • Goodhart's law: "When a measure becomes a target, it ceases to be a good measure." One way this can occur is when individuals anticipate the effect of a policy and then take actions that alter its outcome.
  • Kerckhoffs's principle: A cryptosystem should be secure even if everything about the system, except the key, is public knowledge.
  • Knuth's Law: Premature optimization is the root of all evil.
  • Linus's Law: "given enough eyeballs, all bugs are shallow"--every bug can be found with enough code review
  • Littlewood's Law - based on the law of large numbers, if there are billions of people, a one-in-a-million "miracle" event will happen fairly often
  • Kryder's Law is the assumption that disk drive density, also known as areal density, will double every thirteen months. The implication of Kryder's Law is that as areal density improves, storage will become cheaper.
  • Larman's Laws of Organizational Behavior has 5 parts, but the most interesting is: "any change initiative will be reduced to redefining or overloading the new terminology to mean basically the same as status quo."
    Also from that link: John Seddon observed this: "Attempting to change an organization’s culture is a folly, it always fails. Peoples’ behavior (the culture) is a product of the system; when you change the system peoples’ behavior changes."
  • Law of Triviality: C. Northcote Parkinson's 1957 argument that members of an organization give disproportionate weight to trivial issues. He gives the example of a fictional committee whose job was to approve the plans for a nuclear power plant, yet which spent the majority of its time discussing relatively minor but easy-to-grasp issues.
  • Lehman's Law -- Software that does not evolve becomes less useful (entropy happens; software ages like milk, not wine)
  • Manny Lehman's laws of software evolution already indicated that evolving programs continually grow in complexity and deteriorate in structure unless work is done to maintain them (relates to Technical Debt).
  • Parkinson's law is the adage that "work expands so as to fill the time available for its completion". It is sometimes applied to the growth of bureaucracy in an organization.
  • Postel's Law (Robustness Principle): Be conservative in what you do, be liberal in what you accept from others (often reworded as "Be conservative in what you send, be liberal in what you accept").  There is an interesting related debate here.
  • Rice's Theorem -- hard to see that this theorem is all that relevant on a day-to-day basis, but it is good to know and keep in mind.  See: 10 reasons to ignore CS Degrees
  • Shannon's Law says that the highest obtainable error-free data speed, expressed in bits per second (bps), is a function of the bandwidth and the signal-to-noise ratio.
  • Shannon's Maxim: Kerckhoffs' principle was reformulated (or perhaps independently formulated) by American mathematician Claude Shannon as "the enemy knows the system", i.e., "one ought to design systems under the assumption that the enemy will immediately gain full familiarity with them".
  • Simon's Law - “An ant, viewed as a behaving system, is quite simple. The apparent complexity of its behavior over time is largely a reflection of the complexity of the environment in which it finds itself.” — Herbert Simon (Simon’s Law)
  • Stein's Law "If something cannot go on forever, it will stop," or "Trends that can't continue, won't." Here is a computer related usage (clock speed increase).
  • Thomas theorem: “If men define situations as real, they are real in their consequences.” Robert K. Merton [predictions become an integral part of the situation, something peculiar to human affairs; the predicted return of Halley's Comet is not influenced by men, but bank runs are]; see also: Popper's self-fulfilling prophecy, the Oedipus effect, the Macbeth effect?  See the Soros Lectures.
  • Zipf's law states that given a large sample of words used, the frequency of any word is inversely proportional to its rank in the frequency table. So word number n has a frequency proportional to 1/n. The same relationship occurs in many other rankings, unrelated to language, such as the population ranks of cities in various countries.
  • Ziv’s law: "software development is unpredictable and specifications and requirements will never be fully understood."
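The Amdahl's/Gustafson's entry above can be sketched numerically; here is a minimal Python sketch (the function names are mine, for illustration):

```python
def amdahl_speedup(p, n):
    """Amdahl: max speedup when a fraction p of the work is parallelizable
    across n processors; the serial fraction (1 - p) caps the gain."""
    return 1.0 / ((1.0 - p) + p / n)

def gustafson_speedup(p, n):
    """Gustafson: scaled speedup when the problem size grows with n."""
    return (1.0 - p) + p * n

# With 95% of the work parallelizable, 1024 processors yield under 20x
# by Amdahl's fixed-size view, but nearly 973x by Gustafson's scaled view.
print(amdahl_speedup(0.95, 1024), gustafson_speedup(0.95, 1024))
```

The contrast shows why Gustafson's version feels more practical: real workloads tend to grow to fill the hardware available.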
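The Law of Demeter entry can be made concrete with a small sketch (the Customer/Wallet classes are hypothetical, invented purely for illustration):

```python
class Wallet:
    def __init__(self, balance):
        self.balance = balance

class Customer:
    def __init__(self, balance):
        self._wallet = Wallet(balance)  # internal structure, kept hidden

    def pay(self, amount):
        """Demeter-friendly: callers ask the Customer to act rather than
        reaching through it into the Wallet."""
        if self._wallet.balance < amount:
            raise ValueError("insufficient funds")
        self._wallet.balance -= amount

customer = Customer(100)
customer.pay(30)                    # talks only to its immediate collaborator
# customer._wallet.balance -= 30    # violates the Law of Demeter
```

The payoff: the Wallet representation can change without breaking any caller, since callers never assumed its structure.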
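Zipf's law lends itself to a quick check; in this sketch the toy corpus is constructed so word i appears proportionally to 1/i, which makes rank × count constant:

```python
from collections import Counter

def rank_frequency(tokens):
    """Rank words by count; under Zipf's law, rank * count stays roughly
    constant across ranks."""
    ranked = Counter(tokens).most_common()
    return [(rank, word, count, rank * count)
            for rank, (word, count) in enumerate(ranked, start=1)]

# Toy corpus built to follow Zipf exactly: word i appears 60 // i times.
corpus = [f"w{i}" for i in range(1, 7) for _ in range(60 // i)]
for rank, word, count, product in rank_frequency(corpus):
    print(rank, word, count, product)   # product is 60 at every rank
```

On real text the product drifts rather than holding exactly, but the 1/n shape is strikingly robust.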

Unknown Source
  • Software development is an act of discovery: it is not that the requirements change, but rather that the developers' and customers' understanding of the requirements evolves.  For a good discussion, see this post: The hard thing about software development.

Humorous but poignant "Laws"
  • Acheson's Rule of the Bureaucracy: A memorandum [email] is written not to inform the reader but to protect the writer.  More like it here: http://www.workinghumor.com/murphy/bureaucracy.shtml
  • Atwood's Law: any application that can be written in JavaScript, will eventually be written in JavaScript.
  • Benchley's Law: There are two classes of people in the world: those who divide people into two classes and those who do not
  • Braess' paradox proposed explanation for why a seeming improvement to a road network can impede traffic through it.  See: Predicting the Unpredictable
  • Fundamental Theorem of Software Engineering - any problem can be solved by injecting a layer of indirection.
  • Hanlon's Razor: Never attribute to malice that which is adequately explained by stupidity. (Chaining this principle with the Principle of Charity leads to a proper mental stance toward others that opens the mind to truly understanding another person's triggers/motivations.)
  • Hofstadter's law is the observation that, no matter what it is, “It always takes longer than you expect, even when you take into account Hofstadter's Law.”
  • Norman's Law The day the product team is established, it is behind schedule and over its budget.
  • Sturgeon's Law 90% of anything is crap
  • "As a rule, software systems do not work well until they have been used, and have failed repeatedly, in real applications." Dave Parnas
  • AviD's Law of Regulatory Compliance: "PCI compliance reduces the risk of the penalties of non-compliance". (Good discussion)
  • Universal Scalability Law (Neil Gunther) -- contention and coherency costs cause throughput to scale sublinearly and eventually decline as concurrency grows
  • Wirth's Law states that computer software increases in complexity faster than does the ability of available hardware to run it. Another way of stating this is "Software slows down faster than hardware speeds up." 
Not sure how to relate to software:

Krulak’s law is simple: Soldiers in the field interacting with local people are the most important element of nation-building and counterinsurgency. It has wide applicability to any organization that interacts with the public. One errant interaction can ruin the operation.  Seth Godin: "The closer you get to the front, the more power you have over the brand."

Related Principles

Principle of Charity: By default, give the other person the benefit of the doubt; assume others are rational actors, have something to teach you, and are therefore interesting, and give serious consideration to what they are saying.  If your default stance is oriented the other way, you will miss out on chances to learn and connect with others in mutually beneficial ways.

This can be extended to being charitable about what others can accomplish (e.g., those that you work for/with)--it may just be that you are not imaginative enough to see their value.

[To] make maximum sense of the words and thoughts of others [...] interpret [them] in a way that optimizes agreement

Principle of Charity can be restated as:
  1. The other uses words in the ordinary way;
  2. The other makes true statements;
  3. The other makes valid arguments;
  4. The other says something interesting.

As an aside, at least giving the other person the appearance of being charitable is useful for getting them to be charitable enough to listen to you, so they get the input from you that will help them understand where/how they went off track.

Landauer's principle (from Wikipedia) is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment"
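A back-of-the-envelope computation of the Landauer limit at room temperature (a sketch; the constant is the exact SI value of the Boltzmann constant):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant in J/K (exact since the 2019 SI)
T_ROOM = 300.0       # a typical room temperature in kelvin

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln 2.
min_energy_per_bit = K_B * T_ROOM * math.log(2)
print(f"{min_energy_per_bit:.2e} J per erased bit")  # about 2.87e-21 J
```

Real hardware dissipates many orders of magnitude more than this theoretical floor, which is why the principle matters mostly as a limit argument.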

Useful analogies

Ship of Theseus or Grandfather's Axe -- if, over time, you completely replace each part of a software application, is it still the same application?


Interesting Paradoxes
Ellsberg paradox - From Wikipedia: people will always choose a known probability of winning over an unknown probability of winning, even if the known probability is low and the unknown probability could be a guarantee of winning. For example, given a choice of risks to take (such as bets), people "prefer the devil they know" rather than assuming a risk whose odds are difficult or impossible to calculate.

Simpson's paradox is a phenomenon in probability and statistics, in which a trend appears in several different groups of data but disappears or reverses when these groups are combined.
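Simpson's paradox is easy to demonstrate with the often-cited kidney-stone treatment figures (Charig et al., 1986), reproduced here as a Python sketch:

```python
# Successes and trials for two treatments, split by case severity.
data = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def success_rate(successes, trials):
    return successes / trials

# Treatment A wins within every group...
for group, treatments in data.items():
    a = success_rate(*treatments["A"])
    b = success_rate(*treatments["B"])
    print(f"{group}: A={a:.0%} B={b:.0%}")

# ...yet B wins when the groups are combined, because B was applied mostly
# to the easier (small-stone) cases.
totals = {t: [sum(x) for x in zip(*(g[t] for g in data.values()))]
          for t in ("A", "B")}
print("overall:",
      f"A={success_rate(*totals['A']):.0%}",
      f"B={success_rate(*totals['B']):.0%}")
```

The lurking variable (severity) is what flips the trend, which is why aggregate metrics deserve suspicion.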



Mine this: https://www.exceptionnotfound.net/fundamental-laws-of-software-development/?utm_content=buffer03e05&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer

Classic post that overlaps with the post mentioned above: http://scrum.jeffsutherland.com/2007/07/origins-of-scrum.html

List of eponymous laws on Wikipedia: http://en.wikipedia.org/wiki/List_of_eponymous_laws

Another list of laws: http://www.globalnerdy.com/2007/07/18/laws-of-software-development/

An even better list of laws: http://namcookanalytics.com/wp-content/uploads/2014/02/SoftwareLaws2014.pdf

Note "Al's Law" here: https://www.simple-talk.com/opinion/opinion-pieces/some-laws-of-software-development/

Article by George Gilder listing laws from a visionary book written in 2001:
http://www.kurzweilai.net/the-twenty-laws-of-the-telecosm

Flopping at software through the Fosbury Flop -- as if, once everyone saw the Fosbury Flop, they started using it even to climb stairs; the same way that someone sees Google do something at the scale Google needs, and starts applying Google techniques to the software equivalent of stair climbing.

Mel Conway (@conways_law) tweeted at 4:15 PM on Fri, Apr 26, 2019:
Eponymous "laws" seem to fall into two categories:
(1) An ironic comment on the perversity of Nature, e.g., Murphy.
(2) A sometimes-verifiable formulation based on observation, e.g., Newton.

My own belief is that "Conway's Law" is a (2), but it seems that not everybody agrees.
(https://twitter.com/conways_law/status/1121870554305376257?s=03)

Searching for a law of software (or, more generally, of products of any genre): a certain level of volume is required to get ROI given complexity, and economy of scale requires a focus relative to that volume.  As volume increases, complexity can increase and focus decrease while profitability is still maintained.  At low volume, unless the complexity itself is valued to solve a critical problem where supply is low and demand exceedingly great, one cannot be profitable.

The Twenty Laws of the Telecosm « Kurzweil (kurzweilai.net)

Any Law in these two paragraphs?
Automating any process creates a certain amount of overhead to manage: the process is no longer ad hoc, and it requires supervision to ensure it is done according to process and meets the expectations embodied in the effort to automate it (a payoff from the investment to automate is expected, and is expected to exceed the cost of implementing the process plus the ongoing cost of maintaining the automation).

Choosing not to automate a portion of a "process" has an opportunity cost, in that even that choice requires effort to maintain (next week someone will again try to automate it--and someone needs to say: no, we thought about it, and here's why that is not a good idea in relative terms).

Need to mine this page:  http://pramodkumbhar.com/2019/07/summary-of-computing-laws-amdahl-dennard-gustafson-little-moore-and-more/