Steve Jobs by Walter Isaacson …
(… I have just finished reading the book.)
Just finished this book early this morning. As a rule: never comment right away. Still, I’ll do it, but I’ll refrain from sudden, personal judgements: comments related to my own old-fashioned beliefs (was his harshness toward people worth his products?), comments related to my age (the maudlin thought that the world would lose so much wisdom with one’s death). So here are some technical comments.
1. The author is deeply embedded in the Apple story and in Silicon Valley. That lets him (and many other Apple fans) see Apple products as Steve Jobs wanted them seen. He sees a slightly distorted “reality”. I remember when I bought an iPod for Carla and tried it myself: I very much missed the off switch, until she told me, “press the bottom of the wheel”. Apple use is far from intuitive. But you only notice that when you really haven’t touched a device before. You learn so fast – especially if things aren’t really complicated – and you want to leave your embarrassment behind so quickly that you forget your frustrating first experience. I don’t blame Apple; I just think Apple fans overestimate its simplicity.
···2. Isaacson keeps defending the integrated architecture against open systems. He’s in the bag; I personally am out, free. Out of Apple, because this constant synchronizing with some i-cloud annoys me – i-don’t want it. (And Apple is overpriced as well.) I want to do my own thing, stay on top of it myself, decide what happens and what’s in my gadgets, my PCs, my files – where they are, what they do, how connected they are, what overwrites what. My Psion 3mx is not connected; since I got it in 1999 it has never crashed, and on it I store what I want to store there, and on my PC what I want to have there.
···Anyway: I have been away from Silicon Valley most of my life. I lived there for two fine years, 1970 and ’71, and remained very close to the Valley for many years thereafter, working for Hewlett-Packard and Tandem Computers in Europe. When I was invited back to Cupertino in 1973 to work on HP’s operating system, I preferred to remain in Europe, my home, my heritage. So I never again was woven into the Bay Area as I used to be, and as I had loved to be. I’m homesick for California to this day, but I’m glad to be where I am now.
···Here in Europe, from this outside, we have a more distant view of Northern California and of Apple.
···3. I remember many great computers apart from Apple. The first ones I loved were Grid computers – “GRiD”, as they spelled it. I had, and still have, both the “Lite” model (design by Winfried Scheuer, see picture at right) and the sturdy metal version. I used them extensively in the late 80s and early 90s. They were not so expensive that I could not afford them as a normal citizen – Apple-priced, I’d say. See the Grid “Case” here.
···The next models I really enjoyed were IBM’s ThinkPad Butterflies. Small and clever. I still use ThinkPads; this blog entry was begun this morning on an X60s.
···I wrote numerous articles on and about these laptops; here is one.
···As I never ran around with earbuds, preferring my own self-induced moods to smooth external feelings, I did not follow music devices that much. And if I had, I would not have seen them in connection with computers. What I do see connected to computers are television sets – early Prestel and then Bildschirmtext tried that, but so far nobody has come up with a really good combination. I expect Apple to do that presently, but then again in a non-open way, which probably won’t let me play the videos out of my digital camera.
···As for films, I’ve watched some on a PC, but didn’t really enjoy them as much as on a bigger screen, at least TV size. I didn’t find inserting a DVD into a player under the TV set any more cumbersome than into a PC – with or without a tray. On other occasions, however, Apple drove me crazy when a CD didn’t reappear out of its slot. Where’s the hole for the pin?
···In other words: no grudge against Apple. But I prefer open systems. With all those sealed boxes and approved “apps” we have killed hardware and software curiosity and creativity. Nobody can open his or her calculator any more (I once used an HP 35 to build a phone-charge counter via a simple optical link from the charging pulse to the “enter” key); nobody can program his handheld device any more (I have a couple of self-made programs on my Psion, and I still have my old BASIC interpreters of 1985, even on my wife’s Sony laptop with Windows 7). Closed systems make us stupid (they “suck”, as Jobs would say).
January 27, 2012
January 22, 2012
Dedicated to my old friends, especially to Bert Forbes, here are some remembrances of my Cupertino days back in 1970. Bert was a senior hardware designer at Hewlett-Packard. We were all involved in bringing out the “HP 3000”, as the next model of our minicomputers was named. Bert was the CPU design manager. He later left HP and founded Ziatech.
···My group of selected European systems analysts was supposed to learn all about the coming HP 3000. We would be trained in Cupertino and then run a demonstration and support “data center” at HP’s European headquarters in Geneva. I arrived in Cupertino a bit earlier – I think it was spring of 1970 – and set up accommodations, cars etc. for my colleagues. They were (see brochure, page 15) Marc Brun, Rainer Dern (hardware), Paul Gavarini, John Page, Björn Lindberg (our tallest, to be seen in the picture with the magnetic tape unit), Ray Woodcock, Erich Taschner and myself, plus specialists from the US like Bert Forbes (hardware) and Harlan Andrews. Returning late from California, we ran the Geneva HP 3000 data center from 1972 to 1974. Please correct me if I remember wrongly. Afterwards I set up the data center at Böblingen, see pictures.
Demonstrating the HP 3000
Minicomputers, a now forgotten breed, were square boxes with 16-bit hardware and software. We thought in three-bit binary groups, i.e. octal, which gives digits from zero to seven, decimal eight becoming octal 10 – beautiful. IBM’s hexadecimal bytes escalate to A, B, C, D, E and F, far away from us “minis”. To boot a mini you had to toggle in an initial set of instructions that made the paper tape reader continue the process and load whatever you wanted to run.
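The octal habit can be sketched in a few lines of Python – a generic illustration of reading 16-bit words in three-bit groups, not anything HP-specific:

```python
# Octal view of a 16-bit word: the bits are read in groups of three,
# giving digits 0 through 7 only. Generic illustration, not HP code.

def to_octal(word: int) -> str:
    """Render a 16-bit word the way minicomputer people read it."""
    assert 0 <= word <= 0xFFFF
    return format(word, '06o')   # six octal digits cover 16 bits

print(to_octal(8))        # → 000010: decimal eight is octal 10
print(to_octal(0xFFFF))   # → 177777: the classic all-ones pattern
```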
···Hewlett-Packard competed against Digital Equipment, who had a broader range of minis up to the large PDP-10, and a more dynamic instruction set including a hardware register stack. We were static.
···Minis were used for industrial applications, in medicine and in the military, but we had all completely overlooked any commercial use; that was a thing for IBM to do, and for others like Control Data or Britain’s ICL, on mainframes, driven by magnetic tape and punched cards.
···Hewlett-Packard had two computer divisions. The one in Loveland, Colorado, made “calculators” to control Hewlett-Packard’s many instruments; we in Cupertino made minicomputers, more like modular all-purpose tech items – real computers, as we thought. In fact Hewlett-Packard failed to notice that it had a PC in its product line, perfect with all-in-one-box computing, ideal for business use. Just like the first “home computers”, HP’s calculators had no operating system but ran Basic natively. You turned them on, and there was either an application that started on top of Basic, or Basic itself. Basic commands like “goto” or “for” were printed along the keys on the keyboard as typing shortcuts, I think. While Ataris were toys, HP calculators were serious stuff – and seriously expensive. Nobody even dreamt of calling them “personal computers”.
···HP 2116. At the “real” HP computer division in Cupertino, Page Mill Road, we had operating systems controlling the hardware. Models were the HP 2116, later the 2115 and the “budget” 2114 (see the ad: “Is it possible to get a really good computer for less than $10K?”), with 4k × 16-bit memory.
At first a “basic input output system” (BIOS) ran a single stream of code; the program had to loop while waiting for a typed input or a character from the punched tape reader. SFS, “skip if flag set”, was probably the instruction exercised most of the time. (See an ad from 1970, and an ad for a disc drive from 1971.) Then, in 1969, a hardware interrupt system allowed the program to progress while waiting for the next character from the reader – a big step to speed it all up. An interrupt was just a hardware-forced jump to a subroutine at a specific memory location. With the interrupts turned on, a “real-time operating system” became possible. My group of systems analysts, then in Milan, was assigned to debug it – great intellectual fun.
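The contrast between the polled SFS loop and interrupt-driven input can be caricatured in a few lines. This is a toy model with names of my own invention, not HP 2116 code:

```python
# Toy contrast between polled I/O (the SFS-style busy loop) and an
# interrupt-style callback. Purely illustrative, not HP 2116 code.

class PolledReader:
    def __init__(self, tape):           # tape: characters "on the reel"
        self.tape = list(tape)
        self.flag = False               # device flag: character ready?
        self.buffer = None

    def tick(self):                     # the hardware advances one step
        if self.tape and not self.flag:
            self.buffer = self.tape.pop(0)
            self.flag = True

    def read(self):                     # program side: spin until flag set
        while not self.flag:            # "skip if flag set", else loop back
            self.tick()
        self.flag = False
        return self.buffer

reader = PolledReader("RUN")
polled = [reader.read() for _ in range(3)]   # CPU does nothing else meanwhile

# Interrupt style: the device calls us; the main program can keep working.
received = []
def isr(ch):                            # the hardware-forced jump-subroutine
    received.append(ch)

for ch in "RUN":                        # device delivers characters
    isr(ch)                             # useful work could run between these

print(polled, received)                 # both end up with ['R', 'U', 'N']
```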
···Interrupts allowed another feature: you could make one minicomputer work for more than one user. If you knew which teletype the characters came in from, you could assemble them into input data or command strings in different bags for different users – up to 32 in those days. Initially this was done by a beautiful software driver shifting bits until it got to the correct user, later by a special board. As playing with the operating system was reserved for the real experts (nowadays called “administrators”), the users commanded in Basic, hopefully unable to crash the system. Basic was general knowledge, and Basic was an interpreter, i.e. it let you know about programming errors right away, not only after a lengthy compilation process translating into machine code. Compilation (Fortran) took up to four passes with intermediate paper tapes on a 2114.
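The demultiplexing idea – sorting incoming characters into per-user bags by teletype number – can be sketched like this (the names and the two-teletype example are my own invention):

```python
# Sketch of time-sharing input demultiplexing: each character arrives
# tagged with the teletype it came from and is collected in that user's
# "bag". Hypothetical illustration of the idea, not the original driver.

from collections import defaultdict

MAX_USERS = 32                 # the limit of those days
buffers = defaultdict(str)     # one input bag per teletype

def on_char(tty: int, ch: str):
    """Append an incoming character to the right user's bag."""
    assert 0 <= tty < MAX_USERS
    buffers[tty] += ch

# Interleaved arrivals from two teletypes, numbers 3 and 7:
for tty, ch in [(3, 'L'), (7, 'R'), (3, 'I'), (7, 'U'),
                (3, 'S'), (3, 'T'), (7, 'N')]:
    on_char(tty, ch)

print(buffers[3], buffers[7])   # → LIST RUN
```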
···Incidentally, my Basic interpreter, 55 kbyte of code, copyright Microsoft 1985, still runs on today’s PCs!
···Back in the late 1960s and early 70s, Basic time-sharing users in schools or administration, typing along on their teletypes, got connected directly or via 110-bit-per-second telephone modems to the central minicomputer system – initially a single computer with 64 kwords (of 16 bits each, i.e. 128 kbyte), its addressable maximum, plus a disk. See ads for it from 1970 here and here.
···The HP 3000 was supposed to be Hewlett-Packard’s next computer model. Development had progressed on a 32-bit computer named Omega, stack-oriented, very dynamic; but to top management this seemed too big a step. So another, more humble project with just 16 bits, later called Alpha, was chosen for further life. It had been started by Mike Green, Alan Hewer and Bert Forbes, who comments today: “I don’t recall the original codename – HP3000 was chosen much later. It got renamed the Alpha and suddenly had many people from the Omega team trying to help with it. The expectations for what it needed to do increased by an order of magnitude to include batch, real-time and time-sharing. Quite a load for a sixteen-bit computer of the day.” (I used to have a copy of the Omega specs. I probably can’t find them any more. Bert says there is a copy at the Computer Museum archives in Mountain View, CA.)
···The Alpha was very much state of the art, even beyond, and had relative addressing via pointers for the code, so various users could run their own instances. The stack addressing allowed subroutines to be re-entrant. Data addressing in the planned Omega had been relative as well, but remained absolute in the Alpha, apart from the stack. “And we made the classic mistake of ‘who’ll ever need more than 64k (words)?’ That mistake has been repeated by every generation of computer designers, from Bill Gates (640k bytes) to IP4,” says Bert Forbes today.
···I remember the Alpha had relatively large boards, horizontally stacked, and at one time you had to insert an india rubber between two boards to make the machine run. Probably a connection was broken, unless the board was purposely warped. Bert Forbes knew how.
···The software was far behind schedule. It was supposed to deliver all the next advances beyond time-shared Basic, but in all directions at the same time: not just Basic but simultaneously more languages like Fortran, an assembler, and later even Cobol “for educational use”; not just 32 users but 64; and so on. The system should be so fast that it would finish your typed-in commands, outrunning your fingers (the way Google completes your searches while you type). Dreams, all of it, and as I like to say: an unborn child is everybody’s wish, both boy and girl.
···Finally the HP 3000 was due to be introduced at the 1971 Fall Joint Computer Conference in Anaheim, CA. My European group, at that time already working in Cupertino – see page 14 of www.Joern.com/FritzFolio.pdf – got assigned to write the demonstration program. No operating system software was yet running on the machine, and where it ran at all, it was slow, slow, slow with its obesity. However, the time-sharing round-robin software for the terminals was up. So I suggested an “advertising” demo. A screen page had 25 lines, if I remember well, so we had marketing produce texts of 25 lines at 80 characters each, 27 of them for A to Z. You typed in F, and some promotional text with F in the headline came up, straight out of memory (we had no mass storage running either, but plenty of core memory), etc. The prototype HP 3000 was preloaded from 675 (27 × 25) punched cards into memory, and off it went to the show. The demo was super fast, outperforming any future reality.
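The idea behind the demo can be reconstructed in a few lines – the page texts here are placeholders, not the original marketing copy:

```python
# Reconstruction of the 1971 demo idea: one promotional "page" per letter,
# preloaded into memory and served instantly on a keypress.
# Placeholder texts; the original marketing copy is not reproduced here.

import string

PAGE_LINES = 25   # one screen page, as remembered

pages = {letter: [f"{letter}: promo line {i + 1}" for i in range(PAGE_LINES)]
         for letter in string.ascii_uppercase}

def demo(key: str) -> list[str]:
    """Return the preloaded page for a typed letter, straight from memory."""
    return pages.get(key.upper(), ["(no such page)"])

page = demo('F')
print(page[0])      # the headline line, with F in it
print(len(page))    # 25 lines per page
```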
···In Europe we never oversold the early HP 3000. With its RPG compiler it made a good competitor to IBM’s System/3, being able to run multiple threads for multiple users. So, back in Geneva with my group, we remained unshaken by the sales troubles in the US.
Permalink to here:
http://blogabissl.blogspot.de/2012/01/dedicated-to-my-old-friends-especially.html
Meeting in June 2017 http://blogabissl.blogspot.com/2017/06/california-revisited.html#friends
Time Table (hopefully correct)
late 1970 Data Center Milano
1971 and 1972 – Fritz Jörn in Cupertino, initially alone, preparing for the team to come
July 1972 to November 1972 ― the European HP 3000 team is trained in Cupertino
November 1972 to February 1974 ― European HP 3000 support out of Geneva
···John Page left August 1973 and went back to England
1975 ― HP 3000 support from Böblingen
HP: “A decade of large scale innovation” – In hindsight, HP’s separation and rivalry between computers (in California) and calculators (in Colorado) was tragic. HP never noticed that it had the perfect PC at hand: the calculator with its built-in Basic programming. If only they had added Visicalc …
For some of my pictures of the early HP 3000 please see
http://picasaweb.google.com/Fritz.Joern/HP3000,
including a German brochure and price list from 1980.
The brochure and price list separately as PDF here. The HP 3000 demo data center in Geneva, a brochure
See http://en.wikipedia.org/wiki/HP_3000
History (by Bob Green): http://www.robelle.com/library/smugbook/classic.html
More history (by Christopher Edler): http://www.3k.com/index_papers_hp3000_history.html
Promotional brochure, 1971, “System Description”
HP “Museum”: http://www.3k.com/index_papers_hp3000_history.html – The picture at the top right is from there. For a while the HP 3000 came in different colors – the panels could be individually designed – but this delayed sales decisions beyond reason.
As you can see here, 4.9 MByte of disc memory cost $9975 back in 1971. My German price list of July 1980 shows a “7906M” 20 MB “Master” for 34550 DM plus an HP-IB »Adapter« for 2310 DM, a total of 36860 DM (€18846, approx. $23606), without tax.
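Taking the quoted figures at face value, the cost per megabyte works out roughly as follows – a back-of-the-envelope sketch, ignoring inflation and exchange-rate history:

```python
# Cost per megabyte from the two price points quoted above.
usd_per_mb_1971 = 9975 / 4.9    # 1971: $9975 for 4.9 MB
dm_per_mb_1980 = 36860 / 20     # 1980: 36860 DM for the 20 MB drive + adapter

print(f"1971: ${usd_per_mb_1971:,.0f} per MB")   # about $2036 per MB
print(f"1980: {dm_per_mb_1980:,.0f} DM per MB")  # 1843 DM per MB
```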
A comment by Bob Strand on the picture with the many terminals (http://picasaweb.google.com/Fritz.Joern/HP3000#5700834382860552162): That picture with the multiple 3000 terminals was taken during the testing we did in Cupertino for the Makro stores in The Netherlands. I moved to Amsterdam in December of 1975, and my job there was to consult with the Makro guys who were to write the software for the dual 3000 systems to run their stores in Holland, Belgium, the UK and Spain. This was to replace their current system, which ran on 2100s and had been developed by Dave Mackie. [HP 2100: see http://en.wikipedia.org/wiki/HP_2100 for a good history. fj] It was soon apparent to me that they did not have the expertise to do the job. After months of struggling, things came to a head in a big meeting in Utrecht. During the first half of the meeting they spent the entire time blaming their lack of progress entirely on me: if I had been a better consultant, they would have been further along than they were. After an hour and a half of this, all of us technical people were excused from the meeting, and it continued with only the HP and Makro execs – Jan Schapers, country manager for The Netherlands, and the HP Europe manager (Dick Anderson?), who had come up from Geneva.
···After another hour or two the management meeting ended. Makro's conclusion after this part of the meeting was to turn the entire technical project over to me - the totally incompetent person from the first half of the meeting. They had used about five people for six months to get nowhere. I started from scratch and wrote the whole thing in about four months, all in SPL. (I actually still have the line printer listing of the program on green and white striped paper. Many hundreds of pages.) Once finished, HP sent a manager for me from Cupertino to take all the credit. I left and went to Tandem in February of 1977. I had started at HP in October of 1972 as an instructor in the HP-3000 training department. Met your [Fritz Jörn’s] team there and came to Geneva in the summer of 1973 with Unanski to teach the new release of MPE. You hosted the really nice graduation dinner for that class on my 31st birthday - July 18. That was my first ever trip to Europe.
January 21, 2012
Inflation as a Cure?
»Inflation solves no debt problems«, headlines the NZZ in the business section of January 20, 2012 (international edition). Only the implicit public debt [2010: Germany approx. 120% of GDP, Austria 220%, Spain just under 500%] can be reduced by means of inflation, says Thorsten Hens – that is, the part of the debt mountain that accounts for the promises made to a country’s pension and health systems.
···Even more outspoken is an opinion piece by Renate Ohr, »Euro rescue as a danger to the European Union«, see http://www.nzz.ch/nachrichten/wirtschaft/aktuell/euro-rettung_als_gefahr_fuer_die_europaeische_union_1.14389152.html. She writes, among other things: »This lack of credibility of the agreed ›no bail-out‹ rule was the reason why the debtor countries, both their governments and their private sectors, were able to take on too much credit at too-low interest rates, and why the creditor countries willingly extended it without any productive use being assured. The fact that by now it is not private financiers but the ECB which, via the Target balances, ultimately finances the debtor countries’ current-account deficits does not make matters any better.« – Please read the whole article! And, if need be, learn what a fine dirty business »Target balances« are.
···My own view is that the money-creation mechanics of the banks are to blame for the money glut: loans to (AAA-rated) states count towards core capital and thus permit almost unlimited further money creation. A bank should not lend a penny to a debtor who promises to repay his debt punctually, short term and with interest – provided he can then run up even higher debts with that same bank. That rule belongs in the banking regulations, in Basel III (which is not even in force yet …). But then the free-spending policy would be cutting into its own flesh.
Ein Interview mit Renate Ohr: »Spaltung der Eurozone denkbar«
ZDF-Interview, Video 3'06", mit Renate Ohr, Hannover, links im Bild. Sendung Wiso, 23.05.2011 19:25
18 January 2012
Once More on the Significance of Rating Agencies
Slowly it seems to be trickling through to the lofty academics that the rating of an investment has something to do with a bank's capacity for money creation, and thus with its ability to pump even more non-existent money (into the state). Professor Schmidt this morning on Deutschlandfunk: »… and that is a very fatal point: a great many legal regulations and rules on investments state that this or that may only happen if a rating has at least quality level such-and-such, triple-A or the like. And that is fatal. These are things that were introduced in recent years, most centrally of course in the Basel II capital requirements for banks, but something of the kind is also built into a great many investment rules, for pension funds, investment funds and so on; and with that, politics and others have given the rating agencies a weight that simply makes their judgement very, very weighty. It does not have to be this way!«
···Of course, Professor, it must be so, and more besides: loans to debtors who have no plan ever to repay their debts, other than by taking on new (usually even larger) credit, must never, ever be counted toward a bank's »equity capital«.
···The point is not that the banks themselves are too lazy to assess risks. Nor is it merely a matter of somewhat higher or lower interest rates. The »rated« creditworthiness officially determines how heavily the money a bank lends out (and freshly creates) must be backed by its equity capital, eventually at about 10.5 percent under Basel III; see the Federal Ministry of Finance: »The core capital ratio describes the ratio of a bank's equity capital to its risk-bearing business, that is, to the loans it has granted and the investments it has made. In financial crises the core capital is meant to absorb the losses that may arise from loan defaults and losses in the value of investments. Basel III will in future prescribe a hard core capital ratio of 7 percent (hard core capital of the minimum capital requirement, 4.5 percent, plus hard core capital of the capital conservation buffer, 2.5 percent). To this are added soft core capital of 1.5 percent and supplementary capital of 2 percent, so that in total the capital requirements add up to 10.5 percent.« That is a »leverage« of 100/10.5, roughly a factor of 10. If a loan defaults or slides down the ratings, the bank would thus suddenly have to find ten times more equity capital. All of this flows from money creation by the banks, which we all, but above all the states, still enjoy as the »credit economy«. Incidentally, the somewhat stricter Basel III rules are not even in force yet; we simply carry on.
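The Ministry's percentages and the resulting leverage factor can be checked with a few lines of Python. A sketch only: the four ratio components come from the quotation above, while the 100-million loan book is a made-up figure for illustration.

```python
# Basel III capital components, as quoted from the Federal Ministry of Finance
HARD_CORE_MIN = 0.045        # hard core capital, minimum requirement
CONSERVATION_BUFFER = 0.025  # hard core capital, capital conservation buffer
SOFT_CORE = 0.015            # soft core capital
SUPPLEMENTARY = 0.02         # supplementary capital

total_ratio = HARD_CORE_MIN + CONSERVATION_BUFFER + SOFT_CORE + SUPPLEMENTARY
leverage = 1.0 / total_ratio  # euros of credit supported per euro of capital

print(f"total capital ratio: {total_ratio:.1%}")  # 10.5%
print(f"leverage factor:     {leverage:.1f}")     # 9.5, i.e. roughly 10

# A hypothetical loan book of 100 million would then require:
loans = 100_000_000
required_capital = loans * total_ratio
print(f"capital required: {required_capital:,.0f}")  # 10,500,000
```

The "factor of 10" in the text is this leverage rounded up: one euro of equity carries about nine and a half euros of lending.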
···Of course, the states can prescribe to the banks by law that they must lend them money in any case, or that ratings are simply to be ignored – and we can all stick our heads in the sand.
13 January 2012
Tourbillon
Tourbillon, pronounce it the French way, turbiyon, and for heaven's sake not in English: it is a whirlwind, a tornado, a vortex, or merely a heat thunderstorm.
···Love and desire can be part of it too, as in the first tourbillon I remember, from 1962: the »Tourbillon de la vie« in François Truffaut's film Jules et Jim, sung by Jeanne Moreau (she turns 84 on 23 January 2012):
On the net, and on Youtube in particular, you can find numerous reminiscences of this cult film, even with translations, for example here.
···Breguet's tourbillon turns far more calmly in a mechanical watch. Invented in 1795, it evens out rate inaccuracies by rotating the balance wheel and escapement once per minute. That may once have been helpful for pocket watches, which were mostly carried in the same position; today a tourbillon is no more than an expensive complication, the showpiece of a technology that is obsolete anyway.
Photograph by Eric Kilby. One minute of tourbillon on a Stührling wristwatch, model Imperium Tourbillon, ca. US $2,000; on Amazon from 750 (a different model) to 1,025; not to be had as a cheap copy ...