Abstract / bottom line:
The founding of the new, heavyweight non-profit research company OpenAI gives me (half a year after my first analysis) a further opportunity to voice my criticism of a questionable approach to the AI topic.
One thing I consider valid beyond all question:
The world demands the right to govern and restrict AI in its global political and social consequences! (Humanity demands that AI not be left as a question of companies, national interests and technology.)
Steven Levy outlines in his Medium article how OpenAI was formed; he interviewed Elon Musk and Sam Altman on this topic.
I use the answers from the interview (repeated here in shortened and paraphrased form) to place my comments. (The numbering does not follow the order of the interview.)
OpenAI statement 1: We don't want the one single entity that is a million times more powerful than any human.
OpenAI statement 2: Oversight of the development of AI? - We are super conscious of safety and possible bad AI. If we do see something that we think is potentially a safety risk, we will want to make that public.
My verdict on OpenAI (more fairly: on what was said in the interview):
At almost no point did Altman and Musk tell us the truth. – It is typical US-American egocentricity to simply do what is best for themselves and expect the whole world to believe it is a gift to the world.
But given that "2045" will probably mean doom for the world, it is not.
- The superintelligence control problem (Daniel Dewey) .. 2015-10-20
- Twitter/ Daniel Dewey
- Medium/ AI Control (Paul Christiano)
- Potential risks from advanced AI -1- -2- .. 2016-05-06
- Recode's Code 2016 conference (sadly no YEAR filter) .. 2016-06-02
- OpenAI technical goals .. 2016-06-20
- HN discussion on the OpenAI post above .. 2016-06-20
- Bringing precision to the AI safety discussion .. 2016-06-21
- Concrete Problems in AI Safety (PDF, arXiv.org) .. 2016-06-21
- Concrete Problems in AI Safety (OpenAI) .. 2016-06-21
- HN discussion on the OpenAI post above .. 2016-06-21
The background image uses an illustration by Peter-Michael Carruthers.