10 July 2007

Interpreters

Over a year ago I posted about integrated development environments (IDE's), and mentioned in passing what interpreters were:

An "interpreter" is a program that executes another program line by line, rather than translating it into assembly or machine code. When an interpreter runs a program, it responds to errors differently than a computer running the compiled version of the same program would. Since the compiled program is merely a translation, an error could potentially cause the computer running it to crash; an interpreter, by contrast, can stop and send an error message.
That was, of course, profoundly inadequate. I'd like to post just a little bit more about what interpreters do.

Broadly speaking, any programming language may be implemented with either an interpreter or a compiler, although exceptions may apply. Disagreement exists over whether Java is an interpreted language or a compiled language, with some (Lowe, p. 11) saying it is compiled, and others (Cohn, et al.) saying it is interpreted. Perl may be implemented with a compiler; so may PHP, Lisp, and Python. I don't pretend to be an authority on the subject, but there's a basic concept in logic that the statement "All x is y" is false if even a single x that is not y can be found. I am vulnerable here on the grounds that disagreement may exist over whether a given thing is a compiler or an interpreter.

There are several reasons why a language may be implemented with an interpreter rather than a compiler. First, the obvious reason is that you may want a development environment that allows you to debug the program. With a compiler, you may simply get a message like "Runtime error"; it might tell you more, but a really sophisticated interpreter can help you find the actual spot where the error occurred, and even help correct your spelling. Since the compiler's function is to translate the entire program "in one shot," as it were, into code the machine can read, debugging with a true compiler is a little like finding a needle in a haystack.

Another reason is that an interpreter may be easier to update and experiment with. Ruby was developed around certain design innovations (the "principle of least surprise," or POLS), and of course it was a lot easier to create an interpreter that could run a growing number of commands than a compiler with a fully revised complement of libraries, ported to each specific model of microprocessor. Also, a compiler generates machine code, the ones and zeros that the microprocessor actually understands. In contrast, an interpreter can be written entirely in a high-level programming language like C, without any knowledge of machine code.
________________________________________________
How do Interpreters/Compilers Work?

There are several similarities between compilers and interpreters at the operational level. The code that is sent to the compiler/interpreter for execution is called the source file; sometimes, programs written explicitly for use with an interpreter are called scripts. Both interpreters and compilers include a scanner and a lexer. The scanner module reads the source file one character at a time. The lexer module divides the source file into tiny chunks of one or more characters, called tokens, and specifies the token type to which each belongs; the effect is rather like diagramming a sentence. Suppose the source file is as follows:
cx = cy + 324;
print "value of cx is ", cx;
The lexer would produce this:
cx                --> Identifier (variable)
=                 --> Symbol (assignment operator)
cy                --> Identifier (variable)
+                 --> Symbol (addition operator)
324               --> Numeric constant (integer)
;                 --> Symbol (end of statement)
print             --> Identifier (keyword)
"value of cx is " --> String constant
,                 --> Symbol (string concatenation operator)
cx                --> Identifier (variable)
;                 --> Symbol (end of statement)
The ability of the lexer to do this depends on the ability of the scanner to document exactly where each token occurs in the source file, and on its ability to scan backwards and forwards. Sometimes the precise meaning of a token depends on its position with respect to neighboring characters. For example, operators may contain more than a single character (e.g., < as opposed to <=), so the lexer may have to ask the scanner to back up and check the identity of neighboring characters.
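To make this concrete, here is a minimal sketch of a scanner/lexer in Python. It assumes a tiny toy language, and the token names are my own invention rather than those of any particular implementation:

import re

# Patterns are tried in order; multi-character symbols (e.g., <=) are
# listed before single characters, which handles the lookahead problem
# described above. All token names here are hypothetical.
TOKEN_SPEC = [
    ("Numeric constant", r"\d+"),
    ("String constant",  r'"[^"]*"'),
    ("Identifier",       r"[A-Za-z_]\w*"),
    ("Symbol",           r"<=|>=|==|[=+,;<>]"),
    ("Whitespace",       r"\s+"),            # scanned, then discarded
]

def tokenize(source):
    """Yield (token, token_type) pairs, one token at a time."""
    pos = 0
    while pos < len(source):
        for token_type, pattern in TOKEN_SPEC:
            match = re.match(pattern, source[pos:])
            if match:
                if token_type != "Whitespace":
                    yield (match.group(), token_type)
                pos += match.end()
                break
        else:
            raise SyntaxError("unexpected character %r at position %d"
                              % (source[pos], pos))

for token in tokenize('cx = cy + 324;'):
    print(token)

Run on the first line of the sample source, this prints the same classification as the table above: ('cx', 'Identifier'), ('=', 'Symbol'), and so on.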

The parser receives the tokens and their token types from the lexer and applies the syntax of the language. The parser actually requests the tokens, assesses their appropriateness with respect to the syntax, and sometimes demands additional information from the lexer module.
Parser: Give me the next token.
    Lexer: The next token is "cx", which is a variable.
Parser: OK, I have "cx" as a declared integer variable. Give me the next token.
    Lexer: The next token is "=", the assignment operator.
Parser: OK, the program wants me to assign something to "cx". Next token.
    Lexer: The next token is "cy", which is a variable.
Parser: OK, I know "cy" is an integer variable. Next token, please.
    Lexer: The next token is "+", which is an addition operator.
Parser: OK, so I need to add something to the value in "cy". Next token, please.
    Lexer: The next token is "324", which is an integer.
Parser: OK, both "cy" and "324" are integers, so I can add them. Next token, please.
    Lexer: The next token is ";", which is end of statement.
Parser: OK, I will evaluate "cy + 324" and get the answer.
Parser: I'll take the answer from "cy + 324" and assign it to "cx".
Indents are used here to indicate a subroutine. This illustrates what the interpreter/compiler must do in order to add cy and 324. If the parser gets a token that violates the syntax, it will stop processing and send an error message.
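To make the exchange concrete, here is a minimal sketch in Python of a parser that requests tokens one at a time, reusing the tokenize sketch above; the variable table and all names are illustrative, not any real interpreter's API:

variables = {"cy": 10}   # pretend "cy" was declared earlier as an integer

def parse_assignment(token_stream):
    """Parse and evaluate one statement of the form: var = var + int ;"""
    tokens = iter(token_stream)
    def expect(text):
        token, _ = next(tokens)
        if token != text:
            # a token that violates the syntax stops processing
            raise SyntaxError("expected %r, got %r" % (text, token))
    target, _ = next(tokens)      # "cx", a variable
    expect("=")                   # the assignment operator
    left, _ = next(tokens)        # "cy", a declared integer variable
    value = variables[left]
    expect("+")                   # the addition operator
    right, _ = next(tokens)       # "324", an integer
    value += int(right)           # both are integers, so they can be added
    expect(";")                   # end of statement
    variables[target] = value     # assign the answer to "cx"

parse_assignment(tokenize("cx = cy + 324;"))
print(variables["cx"])            # 334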

The next module is the interpreter or, with compilers, the code generator, which actually executes the code. With interpreters (as opposed to compilers), this stage is sometimes part of the parser itself: the parser executes the statements directly, or converts them into bytecode, an intermediate language that a virtual machine then executes. In the case of a compiler, the code generator produces machine code that can be executed directly by the microprocessor.
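The difference can be sketched in a few lines. Instead of executing the parse directly, the parser can emit bytecode for a tiny stack machine; the opcodes below are invented for illustration and are far simpler than real bytecode formats like the JVM's or CPython's:

# Hypothetical bytecode for: cx = cy + 324;
program = [
    ("LOAD_VAR",   "cy"),    # push the value of cy onto the stack
    ("PUSH_CONST", 324),     # push the constant 324
    ("ADD",        None),    # pop two values, push their sum
    ("STORE_VAR",  "cx"),    # pop the result into cx
]

def run(program, variables):
    """A miniature virtual machine: dispatch on each opcode in turn."""
    stack = []
    for opcode, arg in program:
        if opcode == "LOAD_VAR":
            stack.append(variables[arg])
        elif opcode == "PUSH_CONST":
            stack.append(arg)
        elif opcode == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif opcode == "STORE_VAR":
            variables[arg] = stack.pop()
    return variables

print(run(program, {"cy": 10}))   # {'cy': 10, 'cx': 334}

A code generator, by contrast, would emit the microprocessor's own instructions at this stage rather than feeding opcodes to a dispatch loop.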

(Special thanks to Scorpions4ever)
SOURCES & ADDITIONAL READING: Wikipedia, "Interpreter (computing)" and "Interpreted language"

BOOKS: Doug Lowe, Java for Dummies, Wiley Publishing (2005); Cohn, et al., Java Developer's Reference (1996)


05 July 2007

The FTC & net neutrality

(Series on Communications Law, USA)

I was alerted by a series of posts alleging that the Federal Trade Commission (FTC) had abandoned net neutrality. This is not exactly accurate, and I'd like to start over and explain to readers (in brief) the meaning and status of net neutrality. Top FTC officials are hostile to the concept of net neutrality, since the FTC regards its primary mission as devising and promoting legal standards that favor export revenue. Should Internet neutrality be enshrined into law, the FTC believes, it would undermine the telecommunications sector's ability to capture rents from its infrastructure. The issue has come up during a period in which the FTC is entirely beholden to the telecoms, and it has continuously advertised its position that what is good for AT&T is not only good for America, but the epitome of justice as well.

However, as an arbiter, the FTC needs to validate its position in trade law; it may not rule by whim. Hence, it has published endless "studies," which amount to editorials that might have been written by attorneys for the major telecom firms. It is Congress that must decide.

The Internet uses a system of packet switching to transmit very large amounts of digital information over existing telephone, coaxial cable, and DSL lines. In the past, when people used telephone lines solely to communicate orally, the effect was analogous to a train, which occupied the track (so to speak) for the entire duration of the phone conversation. If we pretend that all roads consist of one lane, and that they are interchangeable with railroads (so that, for example, it were possible for trains to use, and tie up, the highway for half an hour at a stretch), then the analogy is nearly perfect. Only one train headed for one destination may occupy one track at a given time. This is compatible with the immense loads that trains, or telephone conversations, carry: a 100-car train may carry about ten thousand tons of freight; a telephone conversation, a continuous stream of rich audio data. In contrast, a digital transmission needs only to communicate a finite string of bits. This is equivalent to thousands of little Vespas buzzing onto and off of the highway. Even when an internet connection is active, its traffic to the server is intermittent, like a few hundred scooters, and other scooters can fit in between them with ease. During the course of an internet session of (say) two hours, the volume of data transmitted may well be equivalent to a telephone conversation of one or two minutes. That means that several score internet connections may impose the same load on the lines as a single telephone call.

Of course, the TCP/IP protocols make this possible by arranging the data into packets of bounded size, which then flow through like cars through a busy city center. The Internet Protocol acts like a system of traffic codes and signals that coordinates the packets so that they flow smoothly. The interesting thing about this is that, with improvements in data compression technologies, it is (ironically) possible to use the Internet to transmit audio as packets of bits more efficiently than as analogue streams, the thing that telephone lines were created to carry. Another point to bear in mind is that the Internet has evolved over time so that most traffic now occurs over broadband connections, in which data is transmitted hundreds of times more rapidly. The physical constraints of 1990's-era telephone lines have been superseded by Ethernet and coaxial cable, but this merely means that the load of potential data that can be transmitted has physically increased, without a drastic shift in the prevailing rules.
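The packetizing idea can be sketched crudely in Python; the packet format here is invented and vastly simpler than real TCP/IP:

from itertools import zip_longest

def packetize(destination, message, size=8):
    """Split a message into small, individually addressed, numbered packets."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [(destination, seq, chunk) for seq, chunk in enumerate(chunks)]

a = packetize("host-a", "a long, telephone-call-sized stream of words")
b = packetize("host-b", "a short burst")

# One shared line carries both conversations at once, packet by packet,
# like scooters slotting into gaps in traffic; each receiver reassembles
# its own packets by destination and sequence number.
for pa, pb in zip_longest(a, b):
    for packet in (pa, pb):
        if packet is not None:
            print(packet)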

Now, until recently the IP system has been utterly, relentlessly neutral. The analogy to cars moving through a vastly busier traffic network, with vastly increased capacity, still holds. Stoplights don't award faster access to the cars of wiser and busier people at the expense of cruisers and idle wastrels. In fact, the physics of vehicular traffic is somewhat different from that of internet connections; so in my TRON-like universe, packets marked "Priority A" are awarded closer spacing (and hence, greater volume) than packets marked "Priority B."

The vast majority of internet connections are provided by telephone companies, which created the system of "pipes" based on the presumed mixture of conventional telephone calls and broadband internet connections. The premium on telephone service is so huge that it makes telephone monopolies immensely profitable; from a financial/business perspective, internet service is just a way of getting additional revenue at little marginal cost. The problem is that services such as voice-over-internet protocol (VoIP) would mean that the main revenue stream for telephone companies would be cut off. Instead, people would use the phone company for internet connections, if that. A huge number of Usonians already get their phone service through their wireless service anyway. With technologies like Bluetooth and Wi-Fi, a Skype subscriber can actually use her cell phone at work or at home, and get VoIP service as if she were communicating through a headset plugged into her computer. That would effectively cut out the PCS companies as well as the landline telephone companies.

One way for the telephone companies to get their money back is a strategy known in basic economics as price discrimination, or "discriminatory pricing." This means that people using the Internet more heavily (for example, those who get movies or VoIP through it) would pay a premium, but get better service. Better service, in this case, would mean VoIP packets would get a higher priority.
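The kind of prioritization being proposed can be sketched with an ordinary priority queue; this is a toy model of a prioritizing router, not any carrier's actual scheduler:

import heapq

# Lower number = higher priority; packets and priorities are invented.
arrivals = [
    (1, "VoIP packet 1"),        # premium traffic: "Priority A"
    (2, "web page packet 1"),    # ordinary traffic: "Priority B"
    (1, "VoIP packet 2"),
    (2, "web page packet 2"),
]

queue = []
for order, (priority, packet) in enumerate(arrivals):
    heapq.heappush(queue, (priority, order, packet))   # order breaks ties

# When the line is congested, premium packets are always forwarded first.
while queue:
    _, _, packet = heapq.heappop(queue)
    print(packet)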

It's interesting to note that a mass migration of telephony and television services to IP channels of delivery would lead to a new business model under which broadcast networks, cable TV providers, and telephone companies would all become virtually indistinguishable businesses, all providing a mixture of IP-related services. In the same way that all financial services in the United States were allowed to merge into nationwide supermarkets of finance, it seems likely that all entertainment and communications services in the country would become a single amorphous sector of the economy. And just as supporters of the financial supermarket concept argued that there would be increased competition among financial service providers, so the telephone companies argue that they face increased competition from VoIP (which is usually delivered over phone company lines), and that some form of discriminatory pricing will be required to support innovation in Internet multimedia.

This becomes alarming when the ISP's and search engines collude to apply a discriminatory pricing model to websites. We're already accustomed to search results offering the lowest prices on "The Damned of the Earth."* In the future, search engines like Google or Yahoo! would have to tweak their search algorithms to reflect the new primacy given to bandwidth-intensive services, like... CNN, Fox News, and so on. In effect, the Internet would become extremely skewed towards commercial media, and net-based activism would be vastly more difficult.

However, at this time I am not aware of any recent developments on potential legislation.
SEE ALSO: Video on net neutrality (RSNL&A)

*Not a reliable result; The Damned of the Earth is a notorious book by Frantz Fanon, and also a line from the French version of "The Internationale." I recall conducting a search in the hopes of finding the text online, and instead was offered "The lowest prices on..." OK, I thought it was really funny.
SOURCES & ADDITIONAL READING: David DeJean, "We Still Need Net Neutrality Legislation" (4 July '07); "FTC Net Neutrality Report Tortures Logic," Net Neutrality Policy Blog; "FTC abandons net neutrality," Vnunet.com; "Navigating Between Dystopian Worlds on Network Neutrality" (PDF), speech by FTC Commissioner Jon Leibowitz (Feb 2007); "FTC Chairman Addresses Issue of 'Net Neutrality'," FTC (Aug 2006); Oligopoly Watch, "Oligopoly and network neutrality" (21 Jan 2006)


03 July 2007

Competition and Homologization

"Homologization" is a term I have coined to refer to a fairly common trend in technological change. The term is derived from "homology," a term in logic in which a person argues that two things are not merely analogous—i.e., sharing similar patterns—but share a common identity or root. So, for example, someone might draw an analogy between the Internet and the system of roads; but everyone understands the two things are so dissimilar that the analogy only illustrates a peculiar pattern common to both. On the other hand, the same person might go much further in comparing the Industrial Revolution and the Internet Revolution, arguing that the two were essentially the same phenomenon occurring twice (that someone would not be me).

The homologization of two or more businesses consists of them becoming essentially the same business, albeit through somewhat different media.

Finance, Insurance, & Real Estate (FIRE)
The most famous example is the Usonian financial services sector, which was split into several separate businesses after the financial crash of 1929. Even before the Crash, the banking sector had been partitioned geographically, which of course divided capital markets from local savings banks. Between 1933 and 1999, commercial banking, stock brokerage, investment banking, insurance, and real estate were barred from mixing with each other. A commercial bank like Citibank could not sell insurance or underwrite new issues of equities, nor could it offer brokerage services to customers. In the years between 1996 and 2003 there was a frenzy of M&A activity as the entire financial sector effectively merged into a few financial supermarkets.

Homologization of the US FIRE sector posed an interesting paradox. Each company within that sector was now free to enter other businesses heretofore closed off to it: commercial banks could now sell insurance; investment bankers could offer underwriting services to a wholly different clientèle. Looking at this from another angle, this reflected increased competition: in theory, each bank was now facing competition from all insurance companies, all brokerages, and vice versa. Virtually all firms in the FIRE sector were in favor of changing the regulations to allow homologization; it was the brass ring of pro-business legislation, and the financial press praised the repeal of the Glass-Steagall Act as if it were the sine qua non of happiness. Yet the same businesses and the same business press argued, at the exact same time, that the new competition created by homologization imposed extraordinary new burdens on that same victorious FIRE sector, which meant that still more regulatory tweaking was required.

Financial services in the USA and other industrial nations tend to share certain state-like powers and benefits that make them utterly different from non-bank firms. For one thing, commercial banks have the power to create money. Investment banks have the unique power to underwrite capital issues under limited liability laws. Brokerages have exclusive access to capital markets, which are, in turn, made possible by limited liability laws. Perhaps most importantly of all, financial services are governed by accounting rules different from those that bind the rest of us; they are allowed to bear far greater leverage against capitalization than non-financial firms. This last feature, common to the whole sector, reflects its role as a premier state surrogate: it can borrow so much money because it guarantees the greater part of the nation's sovereign debt.

I point this out because the financial sector already is a part of the national polity; with the events of 1999, it was absolved from two layers of social responsibility. It was liberated from prudential restrictions on what businesses it could undertake, and it was absolved of [most] community banking regulations.

The increased-competition side of financial homologization has been, in my view, an obvious bust. The banks tended to merge with each other, and to buy insurance companies, brokers, and investment banks; they did not "invade" each other's business with enhanced services. Mostly, they increased convenience through internet automation or ATM's. Meanwhile, savings/commercial banks withdrew from the auto loan markets in favor of home equity loans; in other words, the actual bundle of services offered to customers was shuffled about between banks and third parties (like auto dealers). A branch bank today does offer services unavailable in the early 1990's, but mostly they are not services a consumer ought to use, and it has abandoned useful services as well, creating new monopolies.

Telephony, PCS, ISP, Cable, & Network Broadcasting
The Internet Revolution has led inevitably to the homologization of media. Like the banking sector under the McFadden Act, the old media was regionally segmented; modern media is internationally homogeneous. Another curious development was the short history of the ISP. In the early 1990's, the number of genuinely autonomous internet service providers (ISP's) was immense, because running an ISP required capital and skills that could be supplied anywhere. Telephone companies still regarded their business as telephony, and were concerned mainly with the booming personal cellular service (PCS) industry. By 2000 or so, telephony and PCS were mostly united in odd international cartels, with occasionally overlapping service areas. The vast majority of people used their telephone company as their ISP, although a few specialized firms like Earthlink continued to survive as autonomous ISP's.

Cable television remained divided from the telephony/PCS/ISP part of communications, as did network broadcasting from both. This changed somewhat as cable companies were snapped up by computer companies like Microsoft, and as laws on cross-holdings or market consolidation in media were abolished. Fox News has enjoyed favorable treatment by Congress, and Clear Channel Communications has transformed the media delivery system in this country. Today these two companies have merged cable, radio, and television "content" production, while MSN* has merged software, ISP, and other media categories.

The same paradox has arisen here: all of the media firms involved insist they are experiencing greater competition. Telephony faces competition from VoIP; DSL faces competition from cable; cable faces competition from YouTube, network television faces competition from content-producing cable conglomerates; and so on. Yet market concentration by single firms has exploded in all markets concurrently. All participants insist, and the FTC insists on their behalf, that competition is much greater and there is no longer any reason for public interest regulation.

In the case of the media industry, it's fair to point out that the process of homologization was driven by technology. It's harder to make this claim in the FIRE businesses, where no technological breakthrough comparable to the Internet has occurred. In FIRE, there has been far less acrimony among the industry members; rather, the abolition of Glass-Steagall has permitted polyamory in the sector. In the new media industry, there is clearly a struggle between rival commercial interests, with each purporting to defend the public interest.

A final note: homologization is occurring in other industries and has already occurred in still others. It leads to an interesting aftermath, in which the entire mix of products available is changed. Initially, the customer suffers through deception: even very well-informed customers are vulnerable to deception by gigantic institutions interested in getting them to make sucker bets. Whether the situation improves depends in large measure on whether the customer remembers that she is also a citizen.
*MSN is now known as Windows Live.
