Programming Languages. Industry Standards. The Past, the Present and the Future.

The furious advancement of information technology has prompted the development of many languages to cater to the needs of today's industries. Why? For the simple reason that we are moving into the 21st century, and almost everyone will agree that information processing will affect every facet of modern society. Welcome to the information era.
The past, since the mid-1940s, saw the birth of programming languages: micro-codes for doing simple functions such as calculating arithmetic problems. Did the programmers of the past ever imagine the effect and importance of programming languages today? Did they ever foresee how much these languages would revolutionise and evolve into what they are today? How did they evolve? Where do standards come into the picture? Did the companies of the past ever expect or predict that the computers and communications industry as a whole would spark a multi-billion-dollar industry?
Previously, we have investigated topics such as "Standards and relations to programming languages", "An introduction to an industry standard programming language C++", "A journey through programming language generations" and "What makes a programming language popular?" in the articles leading to this final SURPRISE 96 report. In this report, we look at how programming languages developed and what features they gained as they evolved through the generations, beginning with the importance of standards and how they come about in the area of programming languages, in the hope of shedding some light on what the next industry standard should have or be, and what the future holds for us.
What's this thing called Standards eh?
Standards are basically documented agreements containing technical specifications or other precise criteria to be used consistently as rules, guidelines or definitions of characteristics, to ensure that materials, products, processes and services are fit for their purposes. Standards provide a medium for every part of an organised society to agree, with practical considerations in mind, on the development required in product and process design, manufacturing and distribution, meeting or exceeding the existing defined standard in order to maintain high quality and uniformity throughout the industry. The computer industry is a fine example of a technology that needs to be standardised quickly and progressively at a global level. The description of standards below is intended to help future innovators, possibly pioneering in the field of programming languages, to understand the nature, development stages and importance of standardisation.
Standardisation programmes are now being developed in completely new fields, such as advanced materials, the environment, and urbanisation and construction. There is no doubt that standards play an important role at both national and international level. Industry-wide standardisation, a condition in which the large majority of products conform to the same standards, is the result of consensus agreements reached between all players in the industry: suppliers, consumers, the military and governments. Standards issued by an internationally recognised body such as ISO (the International Organisation for Standardisation) are accepted by organisations world-wide. The IEC has worked very closely with the ISO on international standardisation since an agreement in 1976. A joint mission statement in 1994 introduced the International Telecommunication Union (ITU) as another important partner in international standardisation. Together, the ISO, IEC and ITU co-ordinate and operate the voluntary standardisation process at the international level. Voluntary, consensus-based standards are developed at sub-national, national (for example, the American National Standards Institute (ANSI) and the British Standards Institution (BSI)), regional and international levels. For information technology, which overlaps significant segments of interest for each organisation, ISO and IEC have established a joint technical committee (JTC 1), for which common, co-ordinated working procedures have been agreed with the Telecommunication Standardisation Sector of the ITU (ITU-T).
In 1906, the International Electrotechnical Commission (IEC) was created to draft standards in the electrotechnical field, and it is still a dominant force in the world of standardisation today. Standardisation for other pioneering fields, such as mechanical engineering, was handled by the International Federation of the National Standardising Associations (ISA), set up in 1926. The activities of the ISA ceased in 1942 because of the Second World War. At a meeting in London in 1946, delegates from 25 countries decided to set up a new international standards organisation with the objective to "facilitate the international co-ordination and unification of industrial standards"; it was officially established on 23rd February 1947 and appropriately called ISO.
The name ISO is not actually an acronym for the International Organisation for Standardisation; rather, it was derived from the Greek word "isos", meaning equal. From "equal" to "standard", the line of thinking that led to the choice of "ISO" as the name of the organisation is easy to follow. ISO is a non-governmental organisation with a mission to promote the development of standardisation and related activities in the world, with a common view to facilitating the international exchange of goods and services and to developing co-operation in the spheres of intellectual, scientific, technological and economic activity. The body's work results in international agreements, which are then publicly released as International Standards.
Why do we need standards ?
In a world without standards, similar technological advances by different countries or developers, made without following a common guideline, can create technical barriers to trade: there is no portability between near-similar products. The thickness of credit cards, 0.76 mm, is an example of standardisation; because it is applied internationally, there is no ambiguity when developing card readers for cards of a universal thickness. Export-minded industries have long realised the need for agreement on world standards to smooth international trading. This was another reason for the formation of ISO.
As today's market trends move towards service-based industries, as opposed to manufacturing, countries have adopted open-market strategies in which diverse sources of supply strengthen the need for standardisation. Technologies have to be identifiable, with clearly defined common references recognised from one country to another, developed by consensus and serving as a language of trade. Emerging and developing countries and technologies can use standards to improve productivity, compete in international markets and increase the export capability of their products to achieve sustainable development. In the computer industry, full compatibility among open systems and languages fosters healthy competition amongst program developers and producers, offering wide options to users; it is also a powerful catalyst for the creative development of other innovative products, thus improving productivity and cutting cost.
What becomes a standard?
The ISO is made up of the national standardisation bodies of its member countries (now over 100 world-wide), and only one such body per country can be accepted as a member. These members have principal tasks such as informing interested parties in their respective countries of international standardisation opportunities and initiatives. They must also take responsibility for collecting and organising their country's views and interests and presenting them during the international negotiations leading to standards agreements. They can, and in most cases must, provide a secretariat for the ISO technical committees and subcommittees covering areas they are interested in. Since the ISO is a non-governmental body, financial support for its running is generated through sponsors and the payment of membership dues.
The ISO, in brief, is made up of a General Assembly, a Council, a President, a Treasurer, a Secretary-General, a Central Secretariat, and Technical Committees and their divisions. The technical work of reviewing and investigating proposals is carried out by Technical Committees (TCs) and Subcommittees (SCs). The ISO Council decides whether a TC should be established for any programme of work. A technical agreement between all member bodies of the ISO makes up an International Standard. This follows a lengthy process, mostly due to the decentralised nature of the ISO, which means all working documents must be handled by correspondence, with formal meetings convened only when really justified.
The development of standards follows six stages; in order, they are:
i: Proposal stage
ii: Preparatory stage
iii: Committee stage
iv: Enquiry stage
v: Approval stage
vi: Publication stage.
i: The Proposal Stage
The first step in the process of standardisation is to confirm that an International Standard is needed. The new work item is then submitted to a vote by members of the TC/SC to decide whether it should continue to the following stages. A project leader is appointed at this time.
ii: Preparatory Stage
A working group of experts, with a chairman (usually the project leader), is set up by the TC/SC to prepare a working draft. Successive drafts incorporating improvements and updates are produced until the working group is satisfied that it has the best technical solution to the problem at hand. The draft is then forwarded to the working group's parent committee for the consensus-building phase.
iii: Committee Stage
When the first committee draft is available, it is registered by the Central Secretariat of the ISO. It is distributed for comments and voted on, with successive drafts produced as needed, until consensus is reached on the technical content. Once consensus is attained, the text is finalised and submitted for registration as a Draft International Standard (DIS).
iv: Enquiry Stage
The DIS is now circulated to all ISO member bodies for voting and comment over a period of five months. If two-thirds of the P-members of the TC/SC are in favour and not more than one-quarter of the total number of votes cast are negative, the draft is submitted to become the Final Draft International Standard (FDIS). This new status is registered with the Central Secretariat, and from then until it is finally approved the draft is referred to as the FDIS.
v: Approval Stage
The FDIS is circulated one last time for a final Yes/No vote within two months; any amendments and revisions are noted and registered for consideration in the next edition of the standard. The FDIS is approved if the same voting criteria as above are met. Otherwise, the FDIS is returned to the TC/SC for changes, possibly for new technical reasons reflected in the negative votes cast.
vi: Publication Stage
Once the FDIS is approved, with only minor editorial changes made if necessary, the final text is sent to the ISO Central Secretariat to be published and registered as an International Standard. All International Standards are reviewed every five years by the TC/SC responsible for their development, which decides whether each should be confirmed, revised or withdrawn.
We will see an example of the development of the programming language standard Ada and the rationale behind it.
A revolution is taking place in computer languages, and it is desperately needed. We need to be able to instruct computers much more easily and more quickly than we have been able to in the past. There are two reasons for this.
First, computers are increasing in quantity at a rapid rate. In ten years' time there will be many millions of personal computers, and some of these will be very powerful. Data processing computers will increase in speed by a factor of 40 or 50 in ten years, with minicomputers built from ever faster chips. The annual production of digital functions will increase by about 300 times over the next ten years, and most of these chips will operate at much higher speeds. Given these estimates of future computing power, the productivity of application development must increase by at least two orders of magnitude over the next ten years. This cannot happen if computers continue to be programmed with such languages as COBOL, PL/I, Pascal, C or Ada.
Second, as computers spread, many people who are not data processing professionals must be able to put computers to work. Applications development without professional programmers is becoming a vigorous trend in computing. Applications will increasingly be created by end users, business consultants and systems analysts. Systems analysts need powerful computer languages with which they can quickly build their own applications. The systems analyst's main concentration must be on the business or application, not on the intricacies of coding. A systems analyst should be able to concentrate on the subject matter and put computers to work with powerful application-building tools that, once learned, require little mental effort.
End users should also be able to build their own computer applications. They need languages that are as easy to use as possible and do not require the memorisation of mnemonics, formats, sequences and complex constructs.
The new generation of computer languages, then, needs to be much more powerful than the previous generation so that results can be obtained much faster.
The first generation of computer languages brought us machine language. In the earliest days of computing, there were no interpreters or compilers to translate computer language from one form to another. Early computers were programmed in binary notation. For example,
011011 000000 000000 000001 110101
might mean " clear the accumulator and add the contents of storage location 117 to it. "
It was very difficult to program computers this way without errors. The situation was improved slightly by using mnemonic codes to represent operations. The same instruction might then be
CLA 000000 000000 000001 110101
Later, numbers could be written for storage locations or registers. The instruction might become
CLA 0 0 117
The second generation of computer languages, which came into use in the mid-1950s, produced symbolic assembly languages. Symbolic addresses were used rather than physical machine addresses. For example, our sample instruction might become
CLA 0 0 SCORE

where the symbol SCORE represents the location in memory where the variable representing the score is stored. Symbolic addressing was a great step forward because, when the physical locations of variables or instructions had to be changed (and this occurred constantly), the programmer did not have to rewrite the physical addresses. Assembly languages had names like SAP (Symbolic Assembly Program, for the IBM 704), AutoCoder, SPS, BAL and EasyCoder.
The third generation came into use in the 1960s. These languages were, and still are, generally referred to as high-level languages. Some of them, like COBOL, are used for commercial applications; COBOL has become by far the most commonly used computer language. Some languages, such as PL/I and later Ada, had facilities for both scientific and commercial computing.
With the third generation, computer languages became, to a large extent, independent of the hardware. Programmers can code programs without any knowledge of the machine's instruction set. Because of this hardware independence, programs could be converted to run on different machines. Manufacturer-independent standards have been created for third-generation languages; nevertheless, portability still remains a problem.
Third-generation languages moved a step toward the language of the user. They use English-language words and express formulas in mathematical notation. It is easier to write
X = ( A + B ) / ( C * D )

than to code the same calculation as a sequence of machine-level instructions.
However, third-generation languages need vast numbers of lines of code for typical commercial systems, and they are designed for data processing professionals rather than end users. It is time-consuming to debug a program written in a third-generation language, and the modification of complex systems is very difficult. Many data processing departments become bogged down in complexity and are unable to respond to business needs as quickly as they should.
Fourth-generation languages were created in response to these problems.
Fourth-generation languages need far fewer lines of code than languages like COBOL, PL/I and Ada to perform the same functions; they might be referred to as high-productivity languages. In addition to employing sequential statements like third-generation languages, they employ a diversity of other mechanisms, such as filling in forms or panels, screen interaction and computer-aided graphics.
Fourth-generation languages vary greatly in their power and capabilities. Some are merely query languages ; some are report generators or graphics packages; some can be used to create complete applications; some are very high level programming languages. Such languages may be employed by end users or by systems analysts who directly aid end users. Whereas third-generation languages can be used to create all or most applications, some fourth-generation languages are designed for only a specific class or range of application. Some are highly restricted in their range; others can handle a wide diversity of applications well. In the fourth generation, much more than in the third , we have to select the language to fit the application.
Fifth-generation languages, like AppWare, are supposed to be completely point-and-click: no knowledge of programming is needed to use a 5GL. The advantage of these 5GLs is, of course, their supposed ease of use; with little or no coding, an easy-to-use product can generate new applications.
Therefore, basically four levels of languages are used:

Machine language (First Generation)

(the language of the processor). Each computer has its own language, which is a binary code that can be interpreted by the circuitry of the computer.
Symbolic language (Second Generation)
(or assembly language). A more English-like and understandable alternative to machine language. Before it can be used it must be converted to machine language by a special program called an assembler.
Procedure-oriented language (Third Generation)
High-level languages, whose instructions are translated into multiple machine-level instructions. The programmer specifies the procedures, or logic, necessary to accomplish a specific data processing task. COBOL, FORTRAN and C are examples. These languages are machine independent; compilers are used to convert procedure-oriented languages into machine language.
Fourth generation language (Fourth Generation)
Also known as 'very high level', application-oriented and user-oriented languages. They closely resemble English, do not use compilers in the traditional sense, and are easy to use (with, for example, a very English-like syntax). Examples are database query languages, report generators, and statistical and problem-solving applications.
The higher the language the closer it is to human speech and the more interpretation the computer software has to do before hardware instructions can be executed.
Properties of a fourth-generation language:
Can be used by non-technical users to obtain results
Employs a data-base management system directly
Non-procedural code used where possible
Easy to understand and maintain
Subset can be learned by non-technical users in a two-day training course
Designed for easy debugging
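The "non-procedural code" property above is the key contrast with third-generation languages: stating what result is wanted rather than spelling out how to compute it. The document names no particular 4GL syntax, so the following is only a loose analogue, sketched in Java with invented names, comparing an explicit loop with a declarative pipeline:

```java
import java.util.List;

public class NonProceduralSketch {
    // Procedural (third-generation) style: spell out, step by step, HOW to count.
    static int countAtLeastLoop(List<Integer> scores, int threshold) {
        int count = 0;
        for (int s : scores) {
            if (s >= threshold) count++;
        }
        return count;
    }

    // Declarative style: state WHAT is wanted; closer in spirit to a 4GL query.
    static long countAtLeastStream(List<Integer> scores, int threshold) {
        return scores.stream().filter(s -> s >= threshold).count();
    }

    public static void main(String[] args) {
        List<Integer> scores = List.of(72, 95, 88, 61, 90);
        System.out.println(countAtLeastLoop(scores, 80));   // 3
        System.out.println(countAtLeastStream(scores, 80)); // 3
    }
}
```

Both produce the same answer; the difference is how much of the mechanism the writer must express.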
A significant advantage of Ada is its reduction of debugging time. Ada tries to catch as many errors as reasonably possible, as early as possible. Ada catches many errors at compile time that other computer languages do not catch, or catch much later. Ada programs also catch many errors at run time if they cannot be caught at compile time (this checking can be turned off to improve performance if desired). In addition, Ada includes an exception handling mechanism so that these problems can be dealt with at run time.
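Ada code itself is beyond the scope of this sketch; as a rough illustration of the same idea — a run-time check raising an error that the program handles rather than crashing — here is an analogous fragment in Java (names are invented, and Java's exceptions only approximate Ada's):

```java
public class RuntimeCheckSketch {
    // Like Ada's Constraint_Error, Java raises an exception on an
    // out-of-range index; the handler deals with it at run time.
    static int safeGet(int[] data, int index) {
        try {
            return data[index];                 // run-time bounds check
        } catch (ArrayIndexOutOfBoundsException e) {
            return -1;                          // handle the problem instead of crashing
        }
    }

    public static void main(String[] args) {
        int[] data = {10, 20, 30};
        System.out.println(safeGet(data, 1));   // 20
        System.out.println(safeGet(data, 7));   // -1
    }
}
```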
Ada was originally designed for the U.S. Department of Defence (DoD) for real-time embedded systems, and there's a U.S. law mandating Ada's use in DoD software development projects (with various exceptions and waiver provisions). Ada is the most commonly used language in U.S. weapons systems modernisation (more information about the DoD use of Ada is available).
However, Ada's user base has expanded far beyond the U.S. DoD to many other areas such as large-scale information systems, distributed systems, and scientific computation. Major Ada niches include aerospace and safety-critical systems. An informal 1994 survey concluded that Ada was the most popular language for safety-critical systems.
People use Ada for small projects as well as large ones, since Ada's error-catching capabilities (both compile-time and run-time) significantly reduce debugging time. Also, Ada's parallel constructs can take advantage of today's more advanced operating systems (such as Microsoft's Windows NT, Windows 95, and Mach).
Many people use Ada when the application must run quickly. The Ada programming language was designed to be efficiently implementable, since one of its key application domains is in real-time embedded systems (where efficiency is critical). The actual efficiency of an Ada program, of course, depends on the algorithms selected and the actual Ada compiler used. The first Ada compilers, like many other first compilers of a given language, generated inefficient code; modern Ada compilers generally generate relatively good code. Sadly, the performance of the initial Ada compilers created a myth of slow execution that is only beginning to disappear. The best test of efficiency, of course, is to benchmark a specific compiler with the type of problem you wish to solve.
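The advice to benchmark a specific compiler on your own kind of problem can be sketched as a minimal timing harness. This is an illustrative Java fragment, not a rigorous benchmark; the workload and all names are invented for the example:

```java
public class BenchmarkSketch {
    // A stand-in workload for "the type of problem you wish to solve".
    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 1; i <= n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        long result = sumOfSquares(1_000_000);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("result=" + result + " elapsed=" + elapsedMs + "ms");
    }
}
```

A real comparison would repeat the measurement many times and discard warm-up runs; this only shows the shape of the test.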
Ada contains features commonly found in other programming languages and provides additional support for modern programming practices, for controlling special purpose hardware to meet real-time deadlines, and for the creation and enhancement of large and complex programs by groups of programmers over long periods of time.
Ada encourages good programming practices by incorporating software engineering principles with strong typing, modularity, portability, reusability and readability. These features reduce the costs of software development, verification, debugging and maintenance, which typically strain an organisation's resources over the life of the software.
In 1974, a special division of the United States Department of Defence (DoD) realised that having software for embedded computers developed and maintained in hundreds of programming languages was excessively costly. A cost study was carried out and requirements were drawn up for a single high-level language to be used across all projects; no existing language met the requirements, so it was decided that a new language should be designed. In 1977, design teams from all over the world were commissioned by the DoD to develop four prototype languages against the given requirements. Only one succeeded: Green, selected in May 1979 and designed by a Honeywell/Bull team located in France.
This was the birth of Ada, named in honour of the first programmer in history, Lady Ada Lovelace. The first reference manual, Military Standard MIL-STD-1815, was published in 1980. Interest in making Ada an international standard arose, ironically, in early 1979, a year before it became a military standard: an example of a language developed in order to become a standard, rather than becoming one through popularity. In 1980, ANSI, as the member body representing the US in the ISO, responded by submitting a new work item to the ISO Technical Committee (ISO/TC 97) using the first reference manual as the proposed standard. A procedural difficulty led to another ballot being sent out in September 1980, which remained open until May 1981; Ada was finally assigned to Sub-Committee 5 (ISO/TC97/SC5). Three years later, in 1983, a new military reference manual (MIL-STD-1815A) came out in January, and the ANSI standard was released in February (ANSI/MIL-STD-1815A-1983).
The Ada experts group expanded and was recognised within ISO as Working Group 14 (ISO/TC97/SC5/WG14), and the Ada reference manual (ANSI/MIL-STD-1815A-1983) was registered as document N759. Voting ballots were sent out by SC5 for it to be registered as a draft standard in July 1984; this later became ISO Draft Proposal 8652 (DP8652). ISO/TC97 was undergoing internal structural changes, and the assigned sub-committee, SC5, was divided into SC21 (Information Retrieval, Transfer and Management of OSI) and SC22 (Programming Languages and Application Environments). Ada was assigned to the latter, and the working group was renumbered ISO/TC97/SC22/WG9.
The Teenage Years
January 1985 saw a letter sent out for balloting on the registration of DP8652 as a Draft International Standard (DIS), and by January 1986 ISO/DIS 8652, Programming Language - Ada, was scheduled for circulation within the TC97 member bodies. A separate working group, WG9, was assigned to propose the Ada Programming Language Reference Manual to the ISO community and to assist it through to becoming an International Standard. At a meeting held in Brussels, it was resolved that the resulting interpretations of the standard would be passed from WG14 to WG9. In the spring of 1985, the Ada Joint Program Office (AJPO) announced the North Atlantic Treaty Organisation's (NATO) adoption of Ada as its common high-order language in all military systems from the beginning of 1986. On 12 March 1987, the Central Secretariat of the ISO unanimously approved DIS 8652 and registered ISO 8652:1987, Programming Language - Ada, as an International Standard.
ISO standard procedure is to review a standard every five to ten years and either abandon it, reaffirm it as it stands, or update it. After some years of use and implementation, a decision was made in 1988 to undertake a revision of Ada, and ANSI was asked to sponsor it. ANSI worked very closely with the US DoD and the AJPO on the development of the revised Ada standard, known as Ada 9X. Close consultation with ISO was important to ensure that the needs of the whole world-wide Ada community were taken into account and that ISO adopted the new standard in a timely manner. The Ada 9X effort, led by Tucker Taft, was required to reflect current essential requirements, including large-scale information and distributed systems, scientific computation, and systems programming for real-time and embedded systems. Three main phases were required: determining the requirements for the revised language, the actual development of the definition of the revised language, and the transition from Ada 83 to Ada 9X. In December 1994, Ada 95 became the first ANSI/ISO standard object-oriented programming language.
6. Some new style, but obscure and non-standard programming languages
BASIC, FORTRAN, COBOL ... these programming languages are well known and (more or less) well loved throughout the computer industry. There are numerous other languages, however, that are less known, yet still have ardent devotees. In fact, these little-known languages often have the most fanatical admirers. For those who wish to know more about these obscure languages -- and why they are obscure -- the following catalogue is presented.
SLOBOL Best known for the speed of its compiler. Although many compilers allow you to take a coffee break, SLOBOL compilers allow you to travel to Bolivia to pick the coffee. Forty-three programmers are known to have died of boredom sitting at their terminals while waiting for a SLOBOL program to compile.
VALGOL From its modest beginnings in southern California's San Fernando Valley, VALGOL is enjoying a dramatic surge of popularity across the industry. VALGOL commands include REALLY, LIKE, WELL, and Y'KNOW. Variables are assigned with the =LIKE and =TOTALLY operators. Other operators include the "California Booleans" -- FERSURE and NO WAY. Repetitions are handled in FOR-SURE loops. Here is a sample VALGOL program:
14 LIKE Y'KNOW (I MEAN) START
%% IF PI A =LIKE BITCHIN AND
01 B =LIKE TUBULAR AND
_9 C =LIKE GRODY**MAX
4 K (FERSHURE)**2
4i FOR I =LIKE 1 TO OH MAYBE 100
86 DO WAY + (DITTY**2)
9 BARF(1) =TOTALLY GROSS (OUT)
IF LIKE BAG THIS PROGRAM
$$ LIKE TOTALLY (Y'KNOW)

VALGOL is characterised by its unfriendly error messages. For example, when the user makes a syntax error, the interpreter displays GAG ME WITH A SPOON!
LAIDBACK Historically related, but prior, to VALGOL. LAIDBACK was developed at the (now defunct) Marin County Center for T'ai Chi, Mellowness, and Computer Programming, as an alternative to the more intense atmosphere in nearby Silicon Valley. The center was ideal for programmers who liked to soak in hot tubs while they worked. Unfortunately, few programmers could survive there for long, since the center outlawed pizza and RC Cola in favour of bean curd and Perrier. Many mourn the demise of LAIDBACK because of its reputation as a gentle and non-threatening language. For example, LAIDBACK responded to syntax errors with the message, SORRY MAN, I CAN'T DEAL WITH THAT.
SARTRE Named after the late existential philosopher, SARTRE is an extremely unstructured language. Statements in SARTRE have no purpose; they just are. Thus, SARTRE programs are left to define their own functions. SARTRE programmers tend to be depressed and are no fun at parties.
C- This language was named for the grade received by its creator when he submitted it as a class project. C- is best described as a "low-level" programming language. In fact, the language generally requires more C- statements than machine code statements to execute a given task. In this respect it is very similar to COBOL.
DOGO Developed at the Massachusetts Institute for Obedience Training. DOGO heralds a new era of computer-literate pets. DOGO commands include SIT, STAY, HEEL, and ROLL-OVER. An innovative feature of DOGO is "puppy graphics," a small cocker spaniel that occasionally leaves a deposit as he travels across the screen.
LITHP An otherwise unremarkable language distinguished by the absence of an "s" in its character set. Users must substitute "th". LITHP ith thaid to be utheful in proceththing lithtth.
If you thought that COBOL was a language of the past, think again. Each year, an estimated five billion lines of COBOL code are added to mission-critical business applications worldwide. A draft standard is being developed jointly by the ISO and ANSI's ASC X3 (Accredited Standards Committee X3), with the release of the standard targeted for 1997. The new standard will incorporate basic object-oriented programming capabilities such as inheritance and polymorphism. The existing standard, released in 1985 (ANSI X3.23-1985), was widely accepted and is still very much in use, conforming to the Federal Information Processing Standard for COBOL (FIPS 21-4).
The schedule for the new standard depends on the changes vendors and users request during the review process. The draft standard underwent informal public review in spring 1995; the resulting comments are being considered for incorporation into the draft, which will undergo formal public review in early 1996.
Java: a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multithreaded and dynamic language. One way to characterise a system is with a set of buzzwords, and Java's designers used a standard set of them in describing it. The rest of this section explains what is meant by those buzzwords and the problems they were trying to solve.
They wanted to build a system that could be programmed easily without a lot of esoteric training and which leveraged today's standard practice. Most programmers working these days use C, and most programmers doing object-oriented programming use C++. So even though we found that C++ was unsuitable, we designed Java as closely to C++ as possible in order to make the system more comprehensible.
Java omits many rarely used, poorly understood, confusing features of C++ that in our experience bring more grief than benefit. These omitted features primarily consist of operator overloading (although the Java language does have method overloading), multiple inheritance, and extensive automatic coercions.
We added automatic garbage collection, thereby simplifying the task of Java programming but making the runtime system somewhat more complicated. A good example of a common source of complexity in many C and C++ applications is storage management: the allocation and freeing of memory. By virtue of having automatic garbage collection, the Java language not only makes the programming task easier, it also dramatically cuts down on bugs.
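To make the storage-management point concrete, here is a minimal sketch (the class and method names are ours, not from any library): objects are allocated freely and simply abandoned, and the collector reclaims them; the matching free/delete calls a C or C++ version would need never appear.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal garbage-collection sketch: every List allocated below is simply
// abandoned at the end of its loop iteration. There is no free() or delete;
// the collector reclaims each list once it becomes unreachable.
public class GcSketch {
    public static int churn(int rounds) {
        int total = 0;
        for (int i = 0; i < rounds; i++) {
            List<Integer> scratch = new ArrayList<>(); // heap allocation
            for (int j = 0; j < 100; j++) {
                scratch.add(j);
            }
            total += scratch.size();
            // scratch goes out of scope here and is never explicitly freed
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(churn(10)); // prints 1000
    }
}
```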
Another aspect of being simple is being small. One of the goals of Java is to enable the construction of software that can run stand-alone in small machines. The size of the basic interpreter and class support is about 40K bytes; adding the basic standard libraries and thread support (essentially a self-contained microkernel) adds an additional 175K.
This is, unfortunately, one of the most overused buzzwords in the industry. But object-oriented design is very powerful because it facilitates the clean definition of interfaces and makes it possible to provide reusable "software ICs."
Simply stated, object-oriented design is a technique that focuses design on the data (= objects) and on the interfaces to it. To make an analogy with carpentry, an "object-oriented" carpenter would be mostly concerned with the chair he was building, and secondarily with the tools used to make it; a "non-object-oriented" carpenter would think primarily of his tools. Object-oriented design is also the mechanism for defining how modules "plug and play." The object-oriented facilities of Java are essentially those of C++, with extensions from Objective C for more dynamic method resolution. In fact, as we write, compilers are appearing that translate object-oriented Ada 95 into Java bytecode, so that Ada can be used to develop Java programs (applets and applications).
Java has an extensive library of routines for coping easily with TCP/IP protocols like HTTP and FTP. Java applications can open and access objects across the net via URLs with the same ease that programmers are used to when accessing a local file system.
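As an illustration of URL-style naming described above, the hypothetical sketch below parses a URL into its components with the standard java.net.URI class (the URL string itself is made up); the commented-out line marks where a real fetch over the network would go.

```java
import java.net.URI;

// Sketch: a URL names a remote object much as a path names a local file.
// Parsing needs no network access; fetching (commented out) would.
public class UrlSketch {
    public static void main(String[] args) {
        URI u = URI.create("http://example.com/docs/index.html");
        System.out.println(u.getScheme()); // prints http
        System.out.println(u.getHost());   // prints example.com
        System.out.println(u.getPath());   // prints /docs/index.html
        // Opening the object is one more step (requires network access):
        // java.io.InputStream in = u.toURL().openStream();
    }
}
```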
Java is intended for writing programs that must be reliable in a variety of ways. Java puts a lot of emphasis on early checking for possible problems, later dynamic (runtime) checking, and eliminating situations that are error prone, much as Ada does.
One of the advantages of a strongly typed language (like C++) is that it allows extensive compile-time checking so bugs can be found early. Unfortunately, C++ inherits a number of loopholes in compile-time checking from C, which is relatively lax (particularly method/procedure declarations). In Java, we require declarations and do not support C-style implicit declarations.
The linker understands the type system and repeats many of the type checks done by the compiler to guard against version mismatch problems.
The single biggest difference between Java and C/C++ is that Java has a pointer model that eliminates the possibility of overwriting memory and corrupting data. Instead of pointer arithmetic, Java has true arrays. This allows subscript checking to be performed. In addition, it is not possible to turn an arbitrary integer into a pointer by casting.
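The subscript checking described above can be sketched directly (class and method names are ours, for illustration only): an out-of-range index raises an exception rather than silently touching neighbouring memory as equivalent C code might.

```java
// Sketch of runtime subscript checking: an out-of-range index raises
// ArrayIndexOutOfBoundsException instead of silently reading or
// overwriting neighbouring memory as the equivalent C code might.
public class BoundsSketch {
    public static String probe(int[] a, int i) {
        try {
            return "value=" + a[i];
        } catch (ArrayIndexOutOfBoundsException e) {
            return "out of bounds";
        }
    }

    public static void main(String[] args) {
        int[] a = {10, 20, 30};
        System.out.println(probe(a, 1));  // prints value=20
        System.out.println(probe(a, 99)); // prints out of bounds
    }
}
```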
Very dynamic languages like Lisp, TCL and Smalltalk are often used for prototyping. One of the reasons for their success at this is that they are very robust: you don't have to worry about freeing or corrupting memory. Programmers can be relatively fearless about dealing with memory because they don't have to worry about it getting corrupted. Java has this property and it has been found to be very liberating.
One reason that dynamic languages are good for prototyping is that they don't require you to pin down decisions early on. Java has exactly the opposite property; it forces you to make choices explicitly. Along with these choices come a lot of assistance: you can write method invocations and if you get something wrong, you are informed about it at compile time. You don't have to worry about method invocation error. You can also get a lot of flexibility by using interfaces instead of classes.
Java is intended to be used in networked/distributed environments. Toward that end, a lot of emphasis has been placed on security. Java enables the construction of virus-free, tamper-free systems. The authentication techniques are based on public-key encryption.
There is a strong interplay between "robust" and "secure." For example, the changes to the semantics of pointers make it impossible for applications to forge access to data structures or to access private data in objects that they do have access to. This closes the door on most activities of viruses.
Java was designed to support applications on networks. In general, networks are composed of a variety of systems with a variety of CPU and operating system architectures. To enable a Java application to execute anywhere on the network, the compiler generates an architecture neutral object file format -- the compiled code is executable on many processors, given the presence of the Java runtime system.
This is useful not only for networks but also for single system software distribution. In the present personal computer market, application writers have to produce versions of their application that are compatible with the IBM PC and with the Apple Macintosh. With the PC market (through Windows/NT) diversifying into many CPU architectures, and Apple moving off the 68000 towards the PowerPC, this makes the production of software that runs on all platforms almost impossible. With Java, the same version of the application runs on all platforms. The Java compiler does this by generating bytecode instructions which have nothing to do with a particular computer architecture. Rather, they are designed to be both easy to interpret on any machine and easily translated into native machine code on the fly.
Being architecture neutral is a big chunk of being portable, but there's more to it than that. Unlike C and C++, there are no "implementation dependent" aspects of the specification. The sizes of the primitive data types are specified, as is the behaviour of arithmetic on them. For example, "int" always means a signed two's complement 32 bit integer, and "float" always means a 32-bit IEEE 754 floating point number. Making these choices is feasible in this day and age because essentially all interesting CPUs share these characteristics.
The libraries that are a part of the system define portable interfaces. For example, there is an abstract Window class and implementations of it for Unix, Windows and the Macintosh.*1
The Java system itself is quite portable. The new compiler is written in Java and the runtime is written in ANSI C with a clean portability boundary. The portability boundary is essentially POSIX.
The Java interpreter can execute Java bytecodes directly on any machine to which the interpreter has been ported. And since linking is a more incremental and lightweight process, the development process can be much more rapid and exploratory. As a part of the bytecode stream, more compile-time information is carried over and available at runtime. This is what the linker's type checks are based on, and what the RPC protocol derivation is based on. It also makes programs more amenable to debugging.
While the performance of interpreted bytecodes is usually more than adequate, there are situations where higher performance is required. The bytecodes can be translated on the fly (at runtime) into machine code for the particular CPU the application is running on. For those accustomed to the normal design of a compiler and dynamic loader, this is somewhat like putting the final machine code generator in the dynamic loader.
The bytecode format was designed with generating machine codes in mind, so the actual process of generating machine code is generally simple. Reasonably good code is produced: it does automatic register allocation and the compiler does some optimisation when it produces the bytecodes.
In interpreted code we're getting about 300,000 method calls per second on a Sun Microsystems SPARCStation 10. The performance of bytecodes converted to machine code is almost indistinguishable from native C or C++.
There are many things going on at the same time in the world around us. Multithreading is a way of building applications with multiple threads.*2 Unfortunately, writing programs that deal with many things happening at once can be much more difficult than writing in the conventional single-threaded C and C++ style.
Java has a sophisticated set of synchronisation primitives that are based on the widely used monitor and condition variable paradigm that was introduced by C.A.R. Hoare.*3 By integrating these concepts into the language they become much easier to use and are more robust. Much of the style of this integration came from Xerox's Cedar/Mesa system.
Other benefits of multithreading are better interactive responsiveness and real-time behaviour. This is limited, however, by the underlying platform: stand-alone Java runtime environments have good real-time behaviour. Running on top of other systems like Unix, Windows, the Macintosh, or Windows NT limits the real-time responsiveness to that of the underlying system.
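The monitor-style synchronisation described above can be sketched as follows (the counter class is our own invention): marking a method synchronized makes the object behave as a Hoare-style monitor, so concurrent increments of a shared counter cannot interleave and lose updates.

```java
// Sketch of monitor-based synchronisation: the synchronized keyword gives
// each object a Hoare-style monitor, so concurrent increments of the shared
// counter cannot interleave and lose updates.
public class MonitorSketch {
    private int count = 0;

    // Entering a synchronized method acquires this object's monitor.
    public synchronized void increment() { count++; }
    public synchronized int get() { return count; }

    public static void main(String[] args) throws InterruptedException {
        MonitorSketch counter = new MonitorSketch();
        Runnable work = () -> {
            for (int i = 0; i < 10000; i++) counter.increment();
        };
        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start(); t2.start();
        t1.join(); t2.join();
        // Always 20000; without synchronized, some increments could be lost.
        System.out.println(counter.get());
    }
}
```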
In a number of ways, Java is a more dynamic language than C or C++. It was designed to adapt to an evolving environment.
For example, one major problem with using C++ in a production environment is a side-effect of the way that code is always implemented. If company A produces a class library (a library of plug and play components) and company B buys it and uses it in their product, then if A changes its library and distributes a new release, B will almost certainly have to recompile and redistribute their own software. In an environment where the end user gets A and B's software independently (say A is an OS vendor and B is an application vendor) problems can result.
In short, if A distributes an upgrade to its libraries then all of the software from B will break. It is possible to avoid this problem in C++, but it is extraordinarily difficult and it effectively means not using any of the language's OO features directly.
By making these interconnections between modules later, Java completely avoids these problems and makes the use of the object-oriented paradigm much more straightforward. Libraries can freely add new methods and instance variables without any effect on their clients.
Java understands interfaces, a concept borrowed from Objective C that is similar to a class. An interface is simply a specification of a set of methods that an object responds to; it does not include any instance variables or implementations. Interfaces can be multiply inherited (unlike classes), and they can be used in a more flexible way than the usual rigid class-inheritance structure.
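A minimal sketch of the interface mechanism just described (all names here are hypothetical): each interface is only a set of method signatures, with no state or implementation, and one class may implement several of them even though class inheritance itself stays single.

```java
// Sketch of Java interfaces: each interface is only a set of method
// signatures, with no state or implementation, and a single class may
// implement several of them (class inheritance itself stays single).
interface Printable {
    String label();
}

interface Rankable {
    int rank();
}

class Report implements Printable, Rankable {
    public String label() { return "weekly report"; }
    public int rank() { return 3; }
}

public class InterfaceSketch {
    // Callers depend only on the interface, not on the concrete class.
    public static String describe(Printable p) {
        return "item: " + p.label();
    }

    public static void main(String[] args) {
        System.out.println(describe(new Report())); // prints item: weekly report
    }
}
```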
Classes have a runtime representation: there is a class named Class, instances of which contain runtime class definitions. If, in a C or C++ program, you have a pointer to an object but you don't know what type of object it is, there is no way to find out. However, in Java, finding out based on the runtime type information is straightforward. Because casts are checked at both compile time and runtime, you can trust a cast in Java. In C and C++, on the other hand, the compiler just trusts that you're doing the right thing.
It is also possible to look up the definition of a class given a string containing its name. This means that you can compute a data type name and have it easily dynamically-linked into the running system.
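The lookup-by-name facility just described is the standard Class.forName call; the small helper below (our own wrapper, for illustration) catches the checked exception so callers get a plain string back.

```java
// Sketch of looking up a class definition from a string at runtime via
// Class.forName; the helper catches the checked exception so callers get
// a plain string back.
public class LookupSketch {
    public static String lookupName(String className) {
        try {
            Class<?> c = Class.forName(className); // dynamic lookup by name
            return c.getName();
        } catch (ClassNotFoundException e) {
            return "not found";
        }
    }

    public static void main(String[] args) {
        System.out.println(lookupName("java.lang.StringBuilder")); // prints java.lang.StringBuilder
        System.out.println(lookupName("no.such.Class"));           // prints not found
    }
}
```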
The Java language provides a powerful addition to the tools that programmers have at their disposal. Java makes programming easier because it is object-oriented and has automatic garbage collection. In addition, because compiled Java code is architecture-neutral, Java applications are ideal for a diverse environment like the Internet.
A recent approach to 3GL programming that has important implications in CBL development has been the introduction of visual programming languages (of which the best known and most important by far is Microsoft's Visual Basic for Windows, although there are other similar systems on various platforms). With this approach conventional programming is combined with form based authoring techniques to provide a powerful and accessible development environment.
To write a Visual Basic (VB) application the front end is first constructed by drawing it directly on the screen with a form editor, and BASIC code is then assigned to any event (e.g. a button click or menu selection). VB is provided with a library of standard control objects, and there is an extensive and rapidly growing range of support and expansion products produced by Microsoft and others. VB is an event driven and partially object-oriented language (i.e. it has encapsulation and a single level of inheritance, but it is not possible to derive a hierarchy of descendant types from an object). In its latest incarnation (version 3.0) the library of controls contains many facilities of interest to CBL developers, including excellent graphic and multimedia capabilities and very sophisticated database functionality. The professional edition of VB is similar to the standard edition, but there are many more controls and the database capability is significantly enhanced.
For many purposes VB must be a strong candidate for the authoring system of choice, provided that Windows is the only delivery platform (there is nothing remotely like it for the Mac). Another problem is that VB is an interpreter rather than a compiler (although it does produce a standalone program, and the runtime files that are required are freely distributable). Effectively this means that VB applications are relatively slow, and it cannot produce DLLs. It is rumoured that the next version of VB will contain a compiler.
Because VB cannot produce DLLs (although it can make use of them), if it is used then it must be the main development tool for an application; it cannot easily be used to extend the functionality of another authoring system. VB is, however, a flexible development tool that allows developers with some knowledge of BASIC but little technical understanding of Windows to write sophisticated Windows applications quickly and relatively easily, making use of even the most complex Windows facilities (such as Dynamic Data Exchange (DDE) and Object Linking and Embedding (OLE)).
Update from September 1995
There have been two new developments worth mentioning. Recently Microsoft have released a new version of Visual Basic (version 4). This offers many new features - including improved performance, a much closer integration of OLE and the ability to generate either 16 or 32 bit applications (for Windows 3.x, Windows 95 or Windows NT).
The other major development to occur in this area since originally writing this section is that Borland have released a large and very sophisticated development environment called Delphi. This is an implementation of the Turbo Pascal language combined with a VB-like visual development environment (it can effectively be thought of as "Visual Turbo Pascal"). Unlike VB this system is completely object-oriented, and it is provided with a very large class library - some of which is available in Delphi source code. Also unlike VB, Delphi is a true compiler which can create DLLs as well as complete applications. Delphi has considerable potential for developers who would like to combine most of the ease of use of VB with much of the flexibility of C.
7.2 Future Programming Language Concepts
Learning is incremental: new concepts can be used both to process language and to participate in the learning of further concepts. There are six developments in software technology that will play major roles in the future techniques of programming languages. They are:
i: Logic Programming
ii: Object Oriented Programming
iii: Exploratory Programming Environments
iv: Natural Language Programming (NL)
v: Functional Programming
vi: Automatic Programming
i: Logic Programming
Prolog (Programming in Logic) was invented in 1971 by Alain Colmerauer of the AI unit at the University of Marseilles, based on first-order symbolic logic. This lets the computer draw inferences from a series of declarative true-or-false statements. This powerful language has been regarded as a stepping stone towards a kernel language, well suited to pattern matching, list processing and a variety of other data-handling chores. Prolog allows knowledge to be processed in parallel as well as sequentially, thus allowing fast computation methods. The original Prolog does not deal with abstract data, but this feature is being added, together with higher-order extensions, to the fifth-generation version being extensively researched by groups in Europe and Japan.
ii: Object Oriented Programming
Object-oriented programming, which forms the basis for languages like Turing, C++, Ada and Smalltalk, takes yet another untraditional view of the programming problem. Rather than being built as a collection of procedures and subroutines, object-oriented systems are built up as a collection of data objects, each of which knows how to respond to a set of commands that can be given to it; objects are organised into classes to make use of inheritance. Systems organised in this manner become easy to extend or modify, simply by specialising existing objects to form new applications. This type was explained in the previous section.
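The specialisation idea above can be sketched in Java (the Shape/Square classes are invented for illustration): a new application object is formed by extending an existing class and overriding only the one thing that differs.

```java
// Sketch of extension by specialisation: Square reuses Shape's describe()
// unchanged and overrides only area(), the one thing that differs.
class Shape {
    double area() { return 0.0; }
    String describe() { return "shape with area " + area(); }
}

class Square extends Shape {
    final double side;
    Square(double side) { this.side = side; }
    @Override
    double area() { return side * side; }
}

public class InheritSketch {
    public static void main(String[] args) {
        Shape s = new Square(3.0); // a Square responds to every Shape command
        System.out.println(s.describe()); // prints shape with area 9.0
    }
}
```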
iii: Exploratory Programming
This is a development style that allows the incremental development of complex applications that are too difficult or uncertain to prespecify completely. Instead of requiring a complete, detailed design at the beginning of program construction, this method allows and encourages the programmer to develop several potential designs by building program fragments and exploring how they work. It gives programmers the flexibility to make, understand and control the many changes that will be made to a rapidly developing program.
iv: Natural Language (NL)
A great deal of the world's information is gathered and expressed in natural language, much of it in English. Because NL interfaces enable computers to interact with users in ordinary language, they can make computer power available to consumers who are unable or unwilling to learn a new language. NL can help increase knowledge productivity by providing a mechanical means for manipulating knowledge that is now expressed in natural language, as in encyclopaedias, web pages, files, large databases, manuals and reports. With potential parallelism at hand, we can consider algorithms for highly parallel word-sense selection, speech recognition, and concurrent syntactic, semantic and pragmatic evaluation of sentences. Solid developments in this area are unlikely for another ten years or so.
v: Functional Programming
VLSI is currently not being exploited fully, and recognition of this fact has led to a focus on functional programming, which offers the prospect of much cheaper programs and of new machine architectures that will exploit it. One example of a functional language is LISP, although it retains features of the von Neumann programming style. A second approach is function-level programming, in which existing programs are put together with program-forming operations to form new programs, which are then reused to build even larger ones. This allows parallel operations to be expressed easily, well suited to VLSI technology.
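The function-level idea can be sketched in modern Java (an anachronism relative to this article, used purely for illustration; the function names are made up): two existing functions are combined by the program-forming operation andThen into a new program, without writing any new statement sequence.

```java
import java.util.function.Function;

// Function-level sketch: build a new program (pipeline) from existing ones
// (doubleIt, addOne) using the program-forming operation andThen.
public class ComposeSketch {
    public static void main(String[] args) {
        Function<Integer, Integer> doubleIt = x -> x * 2;
        Function<Integer, Integer> addOne = x -> x + 1;

        Function<Integer, Integer> pipeline = doubleIt.andThen(addOne);

        System.out.println(pipeline.apply(10)); // prints 21
    }
}
```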
vi: Automatic Programming
Twenty years ago, Fortran and other high-level languages were developed to enable programmers to generate many machine-language commands from fewer high-level instructions. Today's efforts are aimed at producing languages that will allow a simple statement of the purpose or behaviour of a program to generate the entire high-level-language algorithm needed to carry out the operations, thus eliminating the need for applications programmers. Since so much of the work of programming involves deciding the order in which instructions are executed, data-flow languages (an idea originating 15 years ago) will automate these decisions and reduce the difficulty of programming.
It is hard to say where we will go from here, but rapid development and extensive research (though many professionals regard them as insufficient), coupled with consumer demand and influence on the development of international standards, will decide the future of programming languages and their concepts.
8. The Finale
Where do we go from here?
As we approach the 21st century, with major advances evident in every area of the information technology field (microelectronics, semiconductors, networked databases, communications and more), the need for sophisticated programming languages is clearly an important factor in assisting these developments. By the end of this decade, conventional systems should be able to execute on the order of 100 million sequential instructions per second and access gigabytes of memory, possibly using fast parallel architectures, letting machines handle speech, graphics and images so that they can interact with humans flexibly and smoothly.
Another advance will probably take place inside the computer system itself as it takes over more of the task of managing its data, internal structures and resources. We will probably see concurrent languages, functional programming, and symbolic processing handling natural language, vision and speech recognition. Yet we still know very little about multiprocessor architectures, concurrent programming or parallelism. Our understanding of task-decomposition strategies is still limited, and current languages, both natural and computer-based, are inadequate for representing concurrency. Elegant ways to express these ideas should emerge in the future, partnered with intelligent systems, as in AI (Artificial Intelligence), capable of solving problems of deep reasoning and understanding, together with communications networks connecting people and machines.
Very high-level programming languages and hardware specifications, integrated software development environments, rudimentary silicon compilers (programs that map directly from a high-level description of a chip's function to its layout), parallel processors for circuit analysis and VLSI design, high-performance workstations, data-flow languages working with data-flow machines, parallel programming languages, compilers and operating systems will possibly be the instruments that shape the future of computer hardware and software.
Writing, testing and debugging software is a very lengthy, labour-intensive process. The next generation will allow automated maintenance, aids for language development and improved programming environments. Automatic programming, a process of automatically generating a program given formal specifications of its inputs and outputs, is one candidate for a future style of programming. Data-flow programming languages such as VAL and Prolog (Programming in Logic), combined with data-flow machines that aggressively exploit VLSI technology, are another candidate: they could generate a program from libraries of standard program modules without the programmer needing to keep track of interactions between processors. Alternatively, the development of intelligent software will allow machines to take over the burden of programming by automatically converting problems into efficient computer programs. Who knows? The future is out there.
We would like to sincerely thank Dr. T. Clarke for his excellent guidance, patience, motivation and understanding throughout this Surprise 96 project; without him, we could not have imagined producing it as it is now. We started this report clueless and ended knowing more than we would have expected. Though it might be long, we felt that the contents of this report were important and needed to be mentioned. The amount of material we came across has educated us in many ways, and we hope it will be just as educational for you to have a glimpse of what the future might require of us programmers. Our heartfelt thanks also go out to the Surprise 96 co-ordinators and our other hard-working friends in ISE 2 who went through the same sleepless nights and stress that we did, not forgetting our social reps!! It has been totally enjoyable, and we definitely think the experience gained is something we will confidently carry forward into our future occupations.
Mathew Kwan and Mohamad Johan Nasir
Next-generation Computers by Edward A. Torrero
"Automatic generation of computer programs" Advances in Computers, Vol 17, Academic Press Inc. 1978, pp57-123
"Can programming be liberated from the von Neumann Style? A functional style and it's algebra of programs" by John Backus, Communications of the ACM, August 1978 pp 613-641
"Functional-level programs as mathematical objects" Proceedings of the conference of Functional programming languages and computer architecture , ACM Oct 1981, pp 1-10
Programming Language Choice Practice and Experience by Mark Woodman
More Effective C++ by Scott Meyers
C++ Complete by Anthony Rudd
American National Standard for Information Systems - Programming language - C , ANSI X3.159-1989
A Comparison of ADA and C++, TRI-ADA '92 Proceedings, pp 338-349, Engle, C.B., Jr.
Dr Dobb's Journal August 1995 C/C++ Programming pp28 and pp 56
Rationale For The ANSI C Programming Language, Silicon Press , 1994 Imperial College Lib. (CCC 4.22C RAT)
International Organization for Standardization, Standardization And Documentation, 1983 . Imperial College Lib (389.6 INT)
Self Assessment procedure XXIII: Programming Languages, Clifton, M.H.,Comm. of the ACM, Journal , Vol 38 Issue 5, pp89-96 , May 1995
Dr Dobb's Journal Scripting Languages pp 36 and pp 52
Emerging Hypermedia Standards: Hypermedia Marketplace Prepares for Hytime & MHEG NTIS no: PB92-120328/HDM 1991.
Language Design For Program Manipulation. IEEE Transactions on Software Eng SE18 1992 Vol 18, no 1 Jan 1992 pp 19
IEEE Software MSOFT-7 MSOFT-8 i) A Dynamic C Based OO System for UNIX May 1991 pp73-85 ii) Fplus. A programming Environment for Scientific Applications. 1991 pp81-92
History of Programming Languages, Richard L. Wexelblat (ed.), Academic Press 1981.
Fourth Generation Languages Volume 1: Principles, by James Martin and Joe Leben, Prentice Hall. 1986
High Level Languages and Their Compilers, Des Watson, Addison-Wesley, 1989.
Programming Language Critiques: Pascal , C ,C++ ,and C-Linda by Jim Baseny May 1995
The Programming Language Oberon - Make it as simple as possible , but not simpler by A.Einstein
Press Release 15th October ,1992 Fourth Generation Programming Language for Engineers and Scientists by email@example.com
Smalltalk, C++, and OO Cobol: The Good, the Bad and the Ugly, by Jeff Sutherland, Object Magazine, 1995