Maestro I
Maestro I was the world's first integrated development environment for software.[1] It was developed by Softlab Munich.
Softlab Munich originally called the software Program Development Terminal system (PET), but renamed it after Commodore International introduced a home computer called the Commodore PET in 1977.
At one time there were 22,000 installations worldwide. The first installations in the USA were at Boeing in 1979, with eight Maestro I systems, and at Bank of America, with 24 systems and 576 developer terminals.[2]
By 1989 there were 6,000 installations in the Federal Republic of Germany. Maestro I was the world leader in its field in the 1970s and 1980s.
Maestro I holds a significant place in the history of technology.
One of the last Maestro I systems is at the Museum of Information Technology at Arlington.[4]
First presentation in 1975
Harald Wieler, co-partner of Softlab Munich, developed the first prototype of the system, then named PET, in 1974. It was based on the Philips X1150 data collection system, originally a Four-Phase system from the USA. Before joining Softlab, Wieler had been an architect and programmer in the development of the mainframe DOS operating system that the Radio Corporation of America licensed to Siemens. The objective in developing Maestro I was a hardware and software programming tool that could be rented for 1,000 Deutsche Mark a month, about the same as the monthly rent of a single-family house in the Munich area at the time.
Introduction
"… I am offering an opinion that will not, or should not, change, because it is consistent, possesses eternal validity, or because it is worth maintaining.[5] I would like to limit myself to software engineering, a subject that I believe I understand somewhat. In the following, we will discuss the constant factors or, as I will call them, 'the invariables of software engineering'."
Maestro was an essential factor in the development of:
- Software engineering
- Origination of development environments
- Human-computer interaction, ergonomics
- Methodology (software technology)
Historical context
In order to understand the impact of Maestro, one has to understand the way programmers worked until about 1975. They would enter their code and test data on paper tape or punched cards. After finishing the punching, the programmer would feed the tape and/or the cards into the computer.
The introduction of the IBM 3270 terminals, together with IBM's ISPF (Interactive System Productivity Facility), constituted a real improvement. The text editor integrated into ISPF allowed source code to be entered in real time. The editor was controlled with commands, line editing and function keys. The disadvantage was that the reaction to input appeared only after a whole page had been entered, which made the operation slow and not very intuitive.
Psychological phenomenon
A delay in a dialogue operation causes an involuntary break in the thinking process and thus in the programmer's work. Fred Brooks calls this phenomenon Immediacy in the landmark paper No Silver Bullet. This is caused by the way short-term memory functions in the brain. Atkinson & Shiffrin proposed a model in 1968 that stipulates that information entering short-term memory fades away in 18–20 seconds if it is no longer attended to. Another important factor is the recency effect which causes a person to remember the last few things better than the things in the middle or the beginning of a time period. Thus, when delays occur in the work, the programmer tends to lose the thread of his or her thoughts.
The introduction of Maestro was considered a real innovation in its time. According to the economist Joseph Schumpeter, innovation consists in the acceptance of a technological or organizational novelty, more so than in its invention. In the case of Maestro, the insight into short-term memory was turned to a technical application: Maestro fed each keystroke directly to the CPU, producing immediate feedback. This feedback was also enabled by the particular characteristics of the hardware, specifically the use of a keyboard and console instead of the earlier punched cards or tape.
A comparison with a later innovation such as Ajax is justified. The name Ajax could regularly be found in the media in 2005, when Google used its asynchronous communication paradigm in interactive applications such as Google Maps. Web applications traditionally worked with forms that had to be completed by the user before anything was sent to the server. The IBM 3270 terminals of the 1970s likewise worked with "forms" (actually screens) that needed to be completed, leading to delays and disturbing breaks in the work. Maestro remedied these delays, much as Ajax did some thirty years later.
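The contrast can be sketched in modern terms. The following TypeScript fragment is purely illustrative and assumes nothing about Maestro's or Ajax's actual interfaces (the function and type names are hypothetical): the first handler only reacts once a whole screen or form has been submitted, the block-mode pattern of the 3270 and of classic web forms, while the second reacts to every keystroke, the immediate-feedback pattern that Maestro implemented and that Ajax-style applications later reproduced in the browser.

```typescript
// Illustrative sketch only: contrasts block-mode ("submit a whole form/screen")
// with per-keystroke feedback. The names below are hypothetical, not a real API.

type Render = (feedback: string) => void;

// Block-mode style (3270/ISPF, classic web forms): the program sees the input
// only after the user has completed and submitted an entire screen.
function handleSubmittedScreen(screenContents: string, render: Render): void {
  render(`processed ${screenContents.length} characters after submit`);
}

// Character-mode style (Maestro, Ajax): every keystroke reaches the processor
// immediately, so feedback can be produced while the user is still typing.
function handleKeystroke(buffer: string, key: string, render: Render): string {
  const updated = buffer + key;
  render(`echo: ${updated}`); // immediate feedback, no waiting for a full screen
  return updated;
}

// Tiny demonstration driver.
const render: Render = (msg) => console.log(msg);
let buffer = "";
for (const key of "MOVE A TO B") {
  buffer = handleKeystroke(buffer, key, render);
}
handleSubmittedScreen(buffer, render);
```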
Milestones
1975: Introduction
The first prototype of PET was developed by Harald Wieler of Softlab, based on a Philips X1150 data collection system (actually a Four-Phase IV/70 system, made in the USA). Wieler worked as architect (and programmer) of operating systems for mainframe computers of RCA and Siemens before joining Softlab.
The development of Maestro was co-funded by the German government. The target was to create an interactive programming terminal for 1,000 Mark (approx. US$500) per month.
"The creator of the Maestro program is an American. But Harald Wieler, 45 years old, has German parents. After completing his studies (Physics), he wanted to get acquainted with the country of his ancestors and found employment in a research laboratory with Siemens in Munich. He met his wife at the home of Bavarian friends of his mother and decided to stay in Germany. He became a co-founder of Softlab in 1971."
"Softlab’s charming specialist, Ms. Christiane Floyd PhD, demonstrated the Program Development System PET on the companies’ stand to a large number of experts."
1977: Connection to Mainframe computers
"The release of data communication procedures for the linking of the PET-hardware, the Philips X1150 (data entry system) with IBM S360/370 and Siemens 4004/7000 completes the development activities by the Software company Softlab of Munich."
1978/79: Export to the USA
"The first US customer was the Boeing Company, the aerospace and defense corporation, with 7 systems. The biggest purchaser became the Bank of America, which ordered 24 Maestro computers with 576 terminals for the 10,000 programmers in its San Francisco computing center. Softlab founded a US branch which sold about 100 Maestro systems with some 2,000 terminals in the US.[6]"
"The Itel Corporation will offer the dedicated 'program producing system' Maestro on the world market outside of Europe, but especially in the USA, on an exclusive basis. 'We have reached the break-even point in Europe with 1,200 installations', stated a Softlab spokesperson in Munich, 'and we are now hoping for a big increase in sales' … The PET project team is confident of its chances on the US market, although it will not be easy for the German software industry to get a foothold in that market. Itel estimates a potential of 150,000 to 200,000 programming workstations in North America. The 1,500 to 2,000 existing Itel-AS installations will form a solid starting point. 'We strive for a leading role in this market niche', said Bob Cabanisz, VP of the data products group at Itel."
1980: Interactive training
"’There is much more education than knowledge in the world’, wrote Thomas Fuller already in 1732. Learning is a mental activity and its efficiency has always been fairly low. The same is shown, some 350 years later, for the poor production results of a modern form of mental activity: software development. This, at least, is the opinion of Rita Nagel, Softlab GmbH. Munich who also believes that this need not be the case. Software Company Softlab has developed an interactive program development system called PET/X1150 which has rationalized the mental activity. It made therefore sense to include training facilities for the PET-users within the same tool."
1982: Connection with IBM TSO, IMS and CICS
Geza Gerhardt, manager of the communications group at Softlab, extended[7] the IBM 3270 simulation in Maestro in 1982. This allowed further off-loading of processing from the mainframe to the dedicated systems.
"The system now offers extended interactive support for design, documentation and testing as well as project management. Next to 3270-BSC dialog, SDLC/SNA is now also supported. Parallel connections to TSO, IMS and CICS are also possible."
Technology
Hardware
The basic system was a "key-to-disc" data entry system. Historical predecessors were "key-to-tape" systems such as the Mohawk Data Recorder, Olympia Multiplex 80 and Philips X1100.
Maestro used the Philips (Apeldoorn, the Netherlands) X1150 Data Entry system, which was built on a Four-Phase (Cupertino, California) IV/70 processor.
A typical configuration at the time of introduction[8] was:
- System with 96-192 KB RAM
- 6-24 (dumb) terminals
- 10-80 MB disc
- Magnetic tape
- Line printer (various types and models were supported)
- Data communication connection
"In the Four-Phase System IV/70 the memory and control requirements of up to 32 keyboard display terminals are combined with the mainframe memory and logic of the Central Processing Unit. As a result, data is displayed directly from refresh areas of the Four-Phase Systems parallel-accessed LSI memory, eliminating the cost of separate buffer memories in every terminal. Using this technique, exceptionally high video throughput results, enabling new information to be displayed at a rate of 395,000 characters per second."
The hardware evolved over time: the Four-Phase IV/70 processor was replaced by the more powerful Four-Phase IV/90 system and more terminals, memory and disc capacity could be supported. The base Philips X1150 Data Entry system was rebranded as Philips P7000 Distributed Processing System as significant additional functionality was added.
Software[9]
The operating system was a proprietary Four-Phase Disc Operating System (rebranded by Philips) which supported the usual components at that time: text editor, assembler, various compilers, and linkage editor.
The Four-Phase software offering consisted of packages for:
- Data Entry (key-to-disc)
- 3270 emulation
- 3270 emulation with programming facilities (a unique package that allowed the user to include local programming to off-load the mainframe)
- COBOL
The original PET/Maestro software made extensive use of existing libraries from the above packages.
Operation
1974: Structured Programming
"One of the cornerstones of modern methods in Software technology was Structured programming. This methodology became obligatory for all program development at Softlab in Munich. Peter Schnupp PhD and one of the founders of Softlab, but also Associate Professor and author of many professional publications, considered Structured Programming to be the ‘Return of common sense’".
The pioneers of Structured Programming, Prof. Edsger Dijkstra and Sir C. A. R. (Tony) Hoare, were keynote speakers at a meeting for software specialists at the Max Planck Institute in Munich in December 1974. Peter Schnupp PhD was chairman of the German Chapter of the ACM at that time and presented a lecture with the content above.
1978: Is the life of COBOL eternal?
"Even if the new program languages would be considerably better than the existing ones, their widespread use would not be certain because of the lack of need with the prospective users. During the design of the new languages, decisions are often made which may bring advantages in scientific institutions but disadvantages in industrial software production. These problems often offset the advantages versus the older designs."
1980: Art, Manual Labor or Science?
"Structure and originality are not necessarily exclusive of each other. This will be proven in the following:
- There are people who are against structures because they harm originality.
- There are people who are against originality because it limits the usability and the maintainability of software products.
- There are people who are in favor of originality, because only then is it possible to realize creativity in programming.
- And, finally, there are those that are against tools that enforce structure, thus excluding originality and blocking ‘self-realization’.
Who is right and who is wrong? Everyone! It depends on your definitions of structure and originality. And, of course, it depends on the use of the right tools.
As it is still not clearly decided whether programming is art, manual labor or science (probably a bit of all three), it is necessary to discuss all three aspects".
Contemporaries
How realistic is Software Technology?
Introduction to the history of Maestro (The effect of psychological mechanisms on Software technology)
"The author has not been witness yet to any larger and successful software project that was executed according to the rules of software technology for more than one third of the projected duration. Or that was explicitly discussed, specified and planned without programming of critical system parts, modeling or similar …."
"On the other hand, there is no reason to disregard a successful project in which the most basic rules for (extensive) specification of the coding were totally neglected. …
This project was the PET system, to be regarded as Germany's, and perhaps the world's, most successful software development tool of its time. The first version of PET was started about four months before it was introduced at the Hannover Fair. Moreover, it was more or less "fumbled" into the software of the Philips X1150 Data Entry system, as an add-on to existing components of the base system, not even as separate ad-hoc programs. This method had the advantage that the system being developed existed from day one, so that the developers were never separated from reality: in the end, they developed their system with the system itself, which constantly confronted them with the real requirements of their environment."
Invariants of Software Engineering
"Prognoses are … a favorite theme in our profession’s press and their editorials. They like to speculate about how client/server systems will replace the mainframes, that Java is the programming language of the future or who e-commerce will change the economy. But, they never reflect about their predictions from yesterday, one year or even 5 years ago – it would be disgraceful and probably not very interesting to anyone. But, it would be an educational experiment, to record once a year, what the changes in Informatics have been over the last two, five and ten years. At the same time, one should reflect how the prognoses from last year have turned out – a good training on your judgment capabilities, mostly a disillusion on one’s capabilities of prediction. If this experiment would seem to laborious, one could replace this with self-reflection: What did I expect in 1980 for the state-of-the-art in 1985, or in 1985 about 1990 etc. "
References
- ↑ Computerwoche: Interaktives Programmieren als Systems-Schlager, 1975/47
- ↑ Der Spiegel, 17 January 1983, page 71, "Akten auf Knopfdruck"
- ↑ Image credit: Museum of Information Technology at Arlington http://mit-a.com/fourphase.shtml
- ↑ Image credit: The Museum of Information Technology at Arlington - Four Phase IV/90
- ↑ RvG: This is the best translation that I can do, but it is still not very good as the original text is equally vague. I would recommend changing the quotation.
- ↑ Der Spiegel, 17 January 1983, page 71
- ↑ RvG: I changed the sentence from realized to extended as the actual IBM3270 emulation software was developed and introduced by Four-Phase System and already part of the original PET software far before 1982.
- ↑ RvG: I extended and changed the original text based on my recollection of that period
- ↑ RvG: new text; i.e. not translated
External links
- Christiane Floyd http://swt-www.informatik.uni-hamburg.de/people/cfl.html
- Peter Schnupp on the history of Maestro I
- IEEE History Center: Ernst Denert Interview (29 June 1993)
- Museum of Information Technology at Arlington - Four Phase
- Four-Phase System, a multi-terminal display-processing system