22 June 2005

EU Research versus the US variety

This is a post about developing trends, not about the two forming rival blocs. From a comment at Europhobia (Jeff) comes this article in Red Herring (link dead) on the comparative prospects for research in the US and the EU. My own view is that the USA is suffering from market fundamentalism, which means it cannot implement a sound industrial policy (and retain manufacturing jobs), and from sectarian fundamentalism (“Christian” identity politics), which makes it impossible to cultivate future generations of scientists among its children. Here’s a post at Chris Mooney’s site that I agree with; it’s very short as well; here’s “The Flight From America” (via Rue a Nation). However, I do like to read alternative opinions, so let’s have a look at "Can Europe Survive?"
Red Herring: Mr. Martikainen, a Finn, started developing a router—hardware that directs streams of data from one computer to another—back in 1982 at VTT, a research institute in Espoo, Finland. The Finnish companies financing the research, including Nokia, didn’t see the potential, so the project was dropped in 1986, shortly before an American startup called Cisco commercialized similar technology. Cisco went on to dominate basic corporate networking gear, with annual sales of more than $23 billion. Mr. Martikainen today works as a professor and researcher; his prototype gathers dust in a university display case.
They have three such examples, then claim this illustrates a trend. However, there are literally scores of famous American concepts, like fuzzy logic, which could not flourish on US soil. There is no universally correct model for implementing technology.
Europe has always produced great science and technology. It just hasn’t been very good at commercializing it. Disruptive technologies like GSM—now the most widely adopted mobile technology in the world—and Linux, open-source software that is arguably the biggest threat that Microsoft has had to face to date, were invented in Europe. So was the web.

By the way, Linux is actually a kernel (by far the most widely used) for the GNU OS. Linux (the kernel) was developed by a large group of volunteers, of whom Linus Torvalds was the predominant one. It's really hard to ascribe a national identity to a program developed by literally thousands of coders working voluntarily around the world, but the GNU project was generally associated with universities in the USA. Linus Torvalds is Finnish. GSM is actually an implementation of TDMA, which is really a generic concept. I'm not trying to deflate EU research here, quite the opposite: I'm just saying that technology doesn't usually have a clear nationality unless you're talking about something so hyper-specific that it has only a few applications.

Many American entrepreneurs have gotten rich from building businesses around the Internet. Tim Berners-Lee, the British scientist who worked at the European Particle Physics Laboratory in Geneva (CERN) when he invented the World Wide Web, did not. At least, not until last June, when he was awarded a €1.2-million ($1.5-million) technology prize.

The reason CERN’s web concept did not become an entrepreneurial triumph is that there was no conceivable business model by which such a format could have made money. HTML, for example, is a fundamental element of the Web; it's the component that Tim Berners-Lee developed. Standards, by their nature, require a complex business plan to make any money at all: give away the standard so it’s propagated; give away a user client, like Acrobat Reader or FlashPlayer, so people can see stuff that complies with the standard; and sell something that the new medium creates a demand for, like Acrobat Writer, Flash, or CGI applications. That Mr. Berners-Lee couldn’t think of a way to make money off of the HTML standard should come as no surprise. Nor should the fact that his browser was not a commercial success; neither was Mosaic.
Europe also fell behind in computers in the 1970s, software in the 1980s, and broadband in the 1990s.
These are not industries that it is prudent to develop, from the point of view of industrial policy (except for broadband). Computing is a capital-intensive industry that offers extremely high levels of risk and concentrated returns. In the USA, there were some early successes that have had little favorable impact on US industry per se: the technology itself was adopted irrespective of where it originated, and the companies that made fortunes, like Apple, IBM, Compaq, and so on, had a very transitory phase of wealth creation. Besides, EU member states have never flourished in those industries.

Software is even worse. In the case of MS, about 99% of its immense return on investment has consisted of rents torn from other enterprises, often inflicting an opportunity cost far greater than the revenues captured by MS. As for the other firms—Oracle, Adobe, and the lot—much of the coding was eventually outsourced to other countries. The total number of individuals involved was tiny, and the period in which the USA reaped any sort of competitive advantage from Silicon Valley was actually quite brief. The immense fortunes made there were fetishized by business magazines, but were in reality a disaster for American enterprise generally: venture capital, usually managed by prudent, hard-nosed people, was sucked into a frenzy much like the South Sea Bubble, then dissipated in stupendous waste. The new technology that emerged consists of standard items that can be manufactured anywhere. The profitable business is in proprietary devices, like microchips, which are manufactured overseas.
“In the end that did not happen, in part because European research is far too dirigiste, planned top down, and in such attempts at detail, serendipity can play no role,” says Mr. Negroponte. “If you visit the MIT Media Lab, there is considerable chaos and it is highly unstructured. Its interdisciplinary nature—itself hard to do in Europe—guarantees a high degree of innovation.” Media Lab Europe shut its doors earlier this year.
If EU research is dirigiste, that’s a problem having to do with the relationship between states and institutions; the MIT Media Lab reflects the dynamics inside a university. A fairer comparison would set the climates within the many American research facilities of different provenance against their many counterparts in the EU. Again, comparing a lab at MIT where there is chaos to a totally different entity in the EU that closed is not merely a cheap shot—it’s comparing apples and oranges. The MIT Media Lab is basically an outgrowth of the MIT university culture; Media Lab Europe was a venture with MIT and the Irish government to transplant that same culture into Dublin. With all respect to Prof. Negroponte, he was the founder of the MIT Media Lab and a collaborator in the scheme to recreate it in Dublin; it is reasonable to expect him to try to blame his failure on those stodgy Europeans.

The article mentions the European sense of entitlement, although it is the USA where gigantic, accountability-proof payments are made to top management; likewise the existence of some people who are suspicious of capitalism generally (as opposed to the USA, where ideological conformity is far more strictly enforced). It closes with a disjointed, complacent rant against national characters.

The last part of the article is actually not so bad, with acknowledgment that the EU is definitely making changes to facilitate startups and make labor markets more flexible. However, the implication is that there is only one way to grow technology, and that way involves startups with socialization of the costs—the model that prevails in the USA. In fact, EU firms innovate at least as much as their American counterparts do, but they use different approaches. Usually, research is done inside existing firms or their Stiftungen (foundations), not in startups. Had Red Herring focused on the Republic of Korea, Taiwan, or Japan, its observations would have been more apt, but more obviously irrelevant. Startups in Japan aren’t impossible, but they’re damn close. Yet Japanese industry is world-beating. Korea’s staggering lead in consumer electronics makes the USA look like the Flintstones, but it was achieved through state-chaebol collaboration.

Again, there is no one model. EU member states are optimized for collusion among the state, large enterprises, and universities; the USA, for startups, privatization of profits, and socialization of costs; NE Asia, for a coordinated division of labor between conglomerates and the finance ministry.


Microsoft & the EU

The European Commission (of the EU; hereafter, the EC) is a very important body in antitrust action nowadays because of the EU's rapidly growing market power. The competition directorate, under the Italian commissioner Mario Monti, attracted international headlines by blocking the merger of Honeywell & GE; Monti also began an action against Microsoft, which has taken a while to attract the sort of attention that the Clinton-era antitrust action against the Redmond, WA-based company did.

The American Department of Justice action against MS was directed firstly against its pressure on retailers to bundle computers with MS Office (or face sanctions from MS), and secondly against its bundling of the Internet Explorer web browser with the Windows OS. The EC action focused not on the web browser, but on Windows Media Player (WMP). BusinessWeek points out that Windows XP N ("N" means there's no WMP) has been a commercial bust, and of course the whole concept of penalizing MS in this particular way is just incredibly silly. For one thing, WMP is usually available for free. The anti-bundling remedies used by antitrust courts have usually culminated in forcing MS to de-bundle one of its features, i.e., make Windows available minus the feature—say, Internet Explorer, WMP, and so on. The customers always want the version with everything, so why would they buy the strippie? And it's an awkward precedent. What about PDAs equipped with GPS? Apple iPods with Bluetooth telephony? This could easily become an obstacle to incorporating technological advances once they appear.

The other front, the disclosure of interoperability information for Windows server protocols, is a much more logical tactic. Competition in the computer industry has made far more progress through licensing agreements than through efforts to control what computer firms may or may not develop.


21 June 2005

Telecommuting the Wave of the Future?

Is telecommuting the way we'll work in the future? From Inescapable Data:
When we interviewed companies for our book and researched the changes in corporate worklife in general, we became aware of the huge shift to non-office office workers. Many large companies such as IBM and Sun boast that currently 33% of their workforce has no corporate ‘office space’ any longer, possibly heading toward 50% or higher.

Well, sure, Sun and IBM have a lot of telecommuters. But what about others? Rep. Frank Wolf (R-VA) is demanding that federal agencies certify that they are increasing telecommuting opportunities for their work forces, on pain of losing funding (Kansas City Star). This article says that 6% of federal employees already telecommute.
Wolf began championing a robust telecommuting program in the government about five years ago as a way to help the Washington area cut traffic congestion and pollution. But agencies have moved slowly on the issue despite encouragement from the Office of Personnel Management and General Services Administration. Many federal managers are not comfortable with the concept or are concerned that employees who work at home may be less productive.


Last year, Wolf directed the departments of Justice, State and Commerce; the Small Business Administration; and the Securities and Exchange Commission to file reports showing that eligible workers are permitted to telecommute.


Wolf has asked the Government Accountability Office to review the telecommuting reports.


Because the House expanded the jurisdiction of his subcommittee, Wolf also would require the National Science Foundation and the National Aeronautics and Space Administration to certify that they provide ample telecommuting opportunities for their work forces. Wolf asked GAO to review their plans as well.


Rep. Wolf is probably looking for ways to trim federal spending on office space and payrolls (so that, for example, the pool of federal recruits need not be limited to people living within daily commuting distance of a federal office). I understand some states have seen fit to modify their tax laws accordingly.
Most notable is a recent case in New York. This spring the New York Court of Appeals ruled against a Tennessee resident, Thomas Huckaby, who telecommuted for a New York company. Huckaby filed New York nonresident income tax returns, allocating his income between Tennessee and New York, based on the number of days he spent working in each state. Because he spent one-quarter of his time in New York, he paid taxes to New York on one-quarter of his income.


However, under the convenience of the employer rule, if a nonresident chooses to telecommute to a New York employer some or most of the time, the nonresident must allocate the income earned at home to New York. The court said New York could tax the telecommuter on one hundred percent of his income.


This is considered the most aggressive tax case to date that targets telecommuters, and it has drawn harsh criticism from the International Telework Association & Council (ITAC), which estimates 44 million telecommuters in the U.S. and sees the ruling as a discouragement for employers to offer telework.
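To see what was at stake, here is a toy calculation contrasting the two allocation rules; the salary is hypothetical, and the one-quarter figure is from the article.

    # Toy comparison of the two allocation rules; the salary is hypothetical.
    salary = 100000.0
    share_of_days_in_ny = 0.25    # from the article

    # Day-count allocation, as Huckaby filed it:
    ny_taxable_day_count = salary * share_of_days_in_ny     # 25,000

    # "Convenience of the employer" rule, as the court applied it: all
    # income earned for the New York employer is allocated to New York.
    ny_taxable_convenience = salary                         # 100,000

    print(ny_taxable_day_count, ny_taxable_convenience)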

If telecommuting becomes as widespread in other farmable services as it is at Sun Microsystems and IBM (improbable, of course), then we could in fact see a telecommuting workforce that really numbers 44 million. (And if this diet works, my weight might really approach what it says on my passport!)

19 June 2005

I've invented something and it's an odd fit-2

(part one)
To put it bluntly, I think the odds favor doing nothing unless a value-added reseller (VAR; the category includes PCS companies like Verizon, Singtel, or Cingular) demands the product. I think the ideal strategy is to find other inventors in the same field, and present the inventions together as part of a single product.


In this case, I think you may need to think about the selling of the invention as part of the invention itself.

This means you have to think about how the invention would be implemented. In the case of Singtel:

Singtel is a service provider. Their job is to provide services to users of cellphones. They buy phones from another company, custom-made, perhaps add microcode, and resell them to users bundled with the PCS. This means they are likely to represent the demand for a new user interface. What I am saying is, your customer is most likely going to be a VAR.

Now, where are these VARs going to be located? If in the USA, the language is English, and I would argue that your ideal customer is not Kristie Midriff, but industrial firms that provide handhelds, like Psion Teklogix. The reason is that these are firms that already provide custom devices with specialized interfaces, and they may well need something with a small, compact interface that they can adapt. This requires that you learn about industrial handhelds.

You then take this knowledge and use it to refine your product.

Also, you need to do a lot of research. There are networks like Skype, and I've blogged about them. You need to use them to get in touch with other inventors who are in the same field and have complementary technologies, because that is how you are going to make your patent attractive to others: by finding complementary ones, contacting the developers, and forming alliances. This will require diplomacy and knowledge. Do you feel that you are up to this? This is what inventing is all about.

17 June 2005

I've invented something and it's an odd fit. What do I do now?

A friend of mine developed a design to improve the user interface of PDAs. It's a pretty cool concept, but it's a very peculiar fit. For an OEM to adopt a new user interface—that's a very important new step. For over a decade now, the manufacturers of PDAs have spent literally billions on trying to make text entry work. The cell phone is gradually merging with the PDA and evolving along the same lines; yet the interface used on cell phones for SMS is not very well suited for it. At the same time, the QWERTY keypad used on larger PDAs like the RIM Blackberry is not very well suited to SMS.* People put up with these interfaces, but it's the demand for SMS (and MMS) that is driving the growth both in PDAs and in 3G cell phones—not their disappointing user interfaces.

So you'd think a new user interface would receive an enthusiastic hearing.

Well, when my friend described his idea to me, I was quite skeptical. I strongly suspect that there exists a mathematically ideal arrangement of keys—both the number of keys, and the arrangement of those keys. I think a reasonably efficient team of mathematicians and programmers could find that arrangement in about a week, and I think that arrangement is on file somewhere at Samsung, LG, Ericsson, and Nokia. I also believe that slightly less reliable information is on file with these firms on the cost of these phones, and less reliable information still on the benefits.

What that means is that our typical large 3G/PDA OEM** knows with 100% certainty what the best design is; knows with 75% certainty what the costs of implementation will be; and knows with 25% certainty what the commercial benefits of implementation will be. Each firm has a patent on its version, or can get a patent whenever it needs to, because each firm used slightly different constraints in finding its optimal design. The managers of the firms understand that their estimates are flawed; they know that they err (a lot) on the side of conservatism when estimating the benefits, and on the side of excess when estimating the costs. But statistically, those "errors" are the way to bet, and so they do.
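A toy calculation makes the logic concrete. All the numbers below are hypothetical except the certainty levels, which come from the argument above: discount the poorly-known benefits hard, pad the better-known costs, and the expected payoff of adopting the new interface goes negative.

    # Toy numbers; only the 75%/25% certainty levels are from the text.
    true_benefit = 100.0        # what the new interface is really worth
    true_cost = 60.0            # what implementing it really costs

    benefit_confidence = 0.25   # benefits highly uncertain: discount hard
    cost_confidence = 0.75      # costs fairly well known: pad modestly

    estimated_benefit = true_benefit * benefit_confidence   # 25.0
    estimated_cost = true_cost / cost_confidence            # 80.0

    print(estimated_benefit - estimated_cost)  # -55.0: bet on "do nothing"

On estimates skewed this way, even a genuinely profitable design (here, a true gain of 40) looks like a loser, which is why the OEMs sit on their optimal keypads.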

(To be continued)

* SMS: short message service; MMS: multimedia messaging service
** OEM: original equipment manufacturer; VAR: value-added reseller


15 June 2005

RIM Lawsuit (Part 2)

Part 1 was background to my other email on the NTP vs. RIM lawsuit that's going on now. RIM stands for Research in Motion, a company based in Waterloo, Ontario, Canada; NTP is a patent holding company (i.e., a legal entity created to own and defend patents) based in Virginia.

One thing I wanted to mention, because it seemed to be on your mind, was the lawsuit between RIM (developer of BlackBerry) and NTP (the patent infringement plaintiff). This suit pertained to the transmission of data between individual BlackBerry units and transmission relay units. The lawsuit applied to all RIM devices, and did not affect the keyboard design. Thomas Campana was both the inventor of this technology and the founder of NTP, about which almost nothing is known (all I know is traced back to publicized court documents and Gartner Dataquest's Todd Kort). Campana appears to have done his work in the early-to-mid 1980's while at AT&T.

According to Kort, who did the research on the NTP vs. RIM case, Campana's behavior was awfully fishy. He came up out of nowhere in 2000 with this lawsuit, but he had the documentation and stomped RIM legally. I think most technology experts consider his case flimsy, but it turns on the geographical location of the patent infringement.

Campana has since died, but NTP naturally continues the fight.

Anyhow, RIM settled and its stock price soared. Unfortunately,
Forbes/New York Times: "As Patent Deal Unravels, Anxiety Rises at BlackBerry Maker"

For three and a half years, patent claims by NTP, which is based in Arlington, Virginia, have been a cloud over Research in Motion, based in Waterloo, Ontario. But the announcement last week that a $450 million agreement reached in March was unraveling came at a particularly delicate time.

In a court filing that followed, the privately held NTP indicated that if the settlement cannot be revived, it plans to invoke an injunction banning sales of BlackBerries and their e-mail service throughout the United States.

That injunction, which was put on hold during an appeal by RIM, has grown more powerful with time. After years of having the wireless e-mail market more or less to itself, RIM now faces competition from hardware makers like Palm Computing and software vendors including Seven Networks, Good Technology and Visto.

For now, a shutdown of Research in Motion in the United States, where the company gets about three-quarters of its revenue, is far from certain. But the renewed legal uncertainty is almost toxic for some members of the investment community.
I am not competent to say if this is a serious possibility or not. But RIM is one of the biggest names in PDAs.

Data Freeway

FTP stands for file transfer protocol. It's the standard for sharing files (of any format) between different operating systems. It allows a webmaster to communicate with the server hosting her homepage.

FTP clients are programs that allow a webmaster to upload or edit the files of a web page. Unlike "normal" files like your term paper (that you wrote in MS Word), the files that are incorporated in your webpage reside on another computer—an FTP server. FTP servers are also called hosts. You are only going to need an FTP client if you have a web page. Even then, you may not need one: this site can be maintained without an FTP client. Most personal web publishers, like Nucleus CMS, Movable Type, bBlog, WordPress, b2evolution, boastMachine, Radio, and Drupal,* have file uploading built in. Likewise, Macromedia Dreamweaver has an FTP client built in.

However, these are often inadequate. Movable Type only allows one to upload files; taking them down or editing them, or re-arranging the file system (like, for example, putting images in subdirectories) is impossible. I've never used the other publishing CGI applications, so I can't comment about them.
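For a sense of what a standalone client can do that the built-in uploaders can't, here is a minimal sketch using Python's standard ftplib module; the host, login, and filenames are placeholders.

    # Minimal FTP housekeeping with Python's built-in ftplib.
    # Host, credentials, and filenames are placeholders.
    from ftplib import FTP

    ftp = FTP("ftp.example.com")                 # connect to the host
    ftp.login(user="webmaster", passwd="secret")

    ftp.mkd("images")                            # make a subdirectory
    with open("flag.gif", "rb") as f:            # upload a local file
        ftp.storbinary("STOR images/flag.gif", f)

    ftp.delete("old-page.html")                  # take a file down
    ftp.quit()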

For those of you who—like me—always want free stuff, there is a free download of an excellent FTP client available: Data Freeway.

So now you know. I've been using this one for several months and I think it's superb.
* List of web publishers and links via Wikipedia


They built a better Paint—and it's freeware

Dear Readers, I now have to make some remarks for fellow Windows users.

Windows comes bundled with a program called Paint. Click the Start button and select Programs, then Accessories. It should be there along with the DOS shell and WordPad. If you've used it a lot, as I have, you know it has some serious limitations. The worst limitation is the way it saves JPEGs. Basically, if you open a bitmap file and save it as a JPEG, it looks like—er, uh, it looks terrible. Suppose the bitmap is the Japanese flag. In BMP, this is a circle of solid red on a field of solid white. Save it as a JPEG, and there's a mist of tiny reddish ripples leaking into the white. Immaculate faces look like they suffer severe acne or scarring.

MS Paint's GIFs are about as bad: that BMP of the Japanese flag is now a crisp circle of muddy grayish brown on a pure white field. At least we can't mistake it for the flag of Bangladesh!

Imagine my joy to discover there is a better way: Paint.Net.

Paint.Net is freeware and you can download it here. When you save JPEGs, you get to choose the level of quality (on a scale of 1-99%). If you pick 95%, the image quality is still quite satisfactory, and the file is about an eighth the size of a BMP. The links to graphics of flags are to GIFs, but the colors are true.
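If you'd rather script the conversion than point and click, the same quality tradeoff can be sketched with the Python Imaging Library (PIL); the filenames here are placeholders.

    # BMP-to-JPEG conversion at two quality settings, using PIL.
    # Lower quality means a smaller file and more of that "mist" of
    # compression ripples around sharp edges like the flag's red disc.
    from PIL import Image

    img = Image.open("flag.bmp")
    img.save("flag_q95.jpg", quality=95)  # still crisp, ~1/8 the BMP size
    img.save("flag_q30.jpg", quality=30)  # much smaller, visible artifacts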


Screencapture of Paint.Net

Another reason to download this program is that the tools for editing files are vastly more powerful. A lot of the effects in Photoshop, for example, are there in Paint.Net.

One feature I would love to see them adopt, however: rotating an image, or a selected part of an image, an arbitrary number of degrees. A lot of times I just want to tilt something 5 degrees.

UPDATE (10 March 2007): I added the illustration above. Also, please note that the current version of Paint.Net has a command, [Shift]-[Control]-[z] (or Layers | Rotate-Zoom), which does indeed allow one to rotate the image or any part thereof an arbitrary number of degrees. One can also tilt the plane of the image, as shown below.

Same image as above, "tilted" backward


Mac & Windows

I'm going to have to admit that I do everything in MS Windows. For years I dreamed of a Macintosh, but put aside those dreams when Steve Jobs ended licensing of the Mac OS and insisted on a monopoly of the proprietary operating system. Since then, the user base of Mac OS has dwindled as a share of the total, and Jobs's focus has been on the Mac as an expression of profound individuality. However, I have observed repeatedly that there is an awful lot of really good creative blogging out there being done by Mac users. I've noticed that the Mac user base, far from being the group of technical novices I would have expected, is actually more technically competent and shrewd than the Windows user base is.

I still have this big ugly pool of frustration with Steve, because I think he spent so much energy marketing Apple products as fashion accessories, but I've gotten older, and increasingly I recognize that Apple products have retained a crucial technical edge over Intel ones in many respects. The anarchic pile of patches and bugs that is Windows essentially uses up all of the skill and talent of coders. The hideously dysfunctional business climate for Windows software developers consigns entire cohorts of coders to arbitrarily-inflicted obsolescence. There's a great Japanese word for this, muda ("waste"), which IMO describes about 75% of the effort in Windows development. Here we are, twenty years after the development of the GUI, and we're taking up Linux as the alternative to Windows. The fact is that Windows is not a programming environment; it's a cross between the Cold War and the Bosnian Civil War. The Mac OS community is, by this analogy, a stable, productive society. So Steve handled the technical issues successfully.

As to the marketing: I have a really tin ear. My ability to anticipate the effect of a certain thing on the public, like the presidential debates, is so poor I've had to give up on the exercise entirely. The sort of reasoning that wins arguments nowadays is not my reasoning.

That's where we're going with this: the market of ideas and products as a programming environment—a civil OS, as it were. I'm used to thinking of operating systems as a bunch of specialized code that you install on a computer so the damn thing can open Word files. But it turns out operating systems are a metaphor for society. The components of the program (institutions) work by collecting what we know about pieces of data (individuals) and associating that data in ways that serve the survival of the institution. The system files mostly evolved in different places, and they don't necessarily work very well together, but it's much too late and too difficult and too controversial to revise them so they make any sort of sense. There's also the core of the OS, which gradually absorbs more functions of the system software (software drivers, social welfare systems, and child protective services), but there will always be things that the OS cannot do that have to be done (things that the state cannot do, that require spontaneous or traditional associations).

And I notice that the programming API (the standards that applications have to fit into in order to run in a programming environment) for our society is set not by political philosophers like John Locke or Thomas Aquinas, but by fashion. This is not a terribly original idea, but it's something I keep overlooking. Karl Marx, John Stuart Mill, W.E.B. DuBois, and Joan Robinson came up with really compelling analytical models to explain how societies interact with their institutions—yes, they did. But these models are terrible at predicting the future, and people who understand them become worse, not better, at reading the responses of their neighbors.

The Mac OS deserves to beat out the Windows OS because it works like a properly engineered design. It's a more effective, waste-avoiding way of doing what programmers and computers are supposed to do than Windows ever will be. It spawns creativity; the anarchy of Windows, as we've seen (and seen, and seen, and seen again), spawns drudgery and frustration. This is not because Windows programmers are stupid; nor do programmers code for Windows because they're masochists. They do it to earn incomes and get their ideas on people's desktops. Windows is the world in which we live; Mac OS is an enclave of sanity. It's like a tiny little corner of Bosnia where Serbs, Croats, and Bosniaks live in harmony. The rest of the country is chaos, and that's the world of computing: massive waste of potential because of chaos. People never actually chose chaos, but that's what we get. So we plan for it, and we assume it.

And that's why fashion sense—a system of hierarchy that explicitly repudiates any logic—is the law of the jungle. Jobs understands that. Why did I not see this before?


14 June 2005

On a Foray into HTML-3

SunSoft Java[*] and Netscape JavaScript[*] are often mentioned in the same breath. They're both programming languages that are commonly associated with the internet, but the similar names are misleading: they refer to very different things. In this blog post and others, I'm going to refer to a program and its elements as "code." You could say programs are written with code. I also will use the term "compiler." This is a program that reads code written in a high-level language and translates it into machine instructions so the computer can do what it's supposed to do.

Java was developed about the same time as Mosaic, the first graphical Web browser. Most computers supplied since the mid-1990's have a "Java virtual machine" (JVM), a program that executes compiled Java code. This allows the same Java program to run anywhere, any time, regardless of the computer on which one is browsing the web. The VM is common to all browsers, regardless of flavor (this is not STRICTLY true!).

An application is any program that you need a computer for, such as word processing or managing a database. A small Java application that runs inside a web page is called an applet. An applet can do pretty much anything that a conventional application can do; so, for example, this list of applets includes calculators, graphers, and simulators; an MP3 player; chat rooms, email programs, and spam blockers are also written in Java.
What about JavaScript? JavaScript was created by Netscape as a simple set of commands that all browsers would recognize. Unlike Java, which is a completely separate programming language designed for autonomous applications, JavaScript is interpreted by the browser itself. JavaScript programs, or scripts, are usually embedded directly in HTML files. The script executes when the user's browser opens the HTML file.

JavaScript allows the person visiting your website to interact with the site. A simple script lets the visitor select the background color of the page. Another script can detect the user's operating system and browser type, then give instructions appropriate to the user's particular computer. A third type evaluates user input. Drop-down menus and combo boxes are things you can do with JavaScript.

(To be continued.)


On a Foray into HTML-2

This post has been edited for accuracy

So, to recap: the Web and the Internet are similar, and it's reasonable for people to use them as synonyms. It's just that the Web is what individual computer users have created with HTML, in the medium of the Internet. The Internet is older; it's the foundation and building material of the Web.

Web pages are created with HTML. This is simply a file type that can be read by a browser. Web pages are "made" of HTML; HTML is a markup language that tells the browsers visiting the site how to display the text and images hosted at the website.

In addition to the HTML files that the browser reads, there are elements that the browser is told to display. Web browsers are designed to "read" (recognize the format and display accurately) JPEG images (*.jpg), GIF images (*.gif), TIF (*.tif), and bitmaps (*.bmp). They can also recognize other types of files, which I'll describe in a moment.

In addition to HTML files, the above-mentioned image files, and Java or JavaScript files, you can post pretty much any type of file you want on your website. However, in order to read things like an MS Word document or an Acrobat PDF, you need to have the corresponding software installed on your computer. Hence the popularity of Adobe Acrobat: the software for reading PDF files is free; people pay for the software that creates *.pdf files. These files will display in a new window of the browser, or in a window spawned by the browser's computer (e.g., Windows or Mac OS will launch MS Excel if you open an Excel file at a website).

WHAT ARE SOME OTHER FILE TYPES YOU CAN HAVE?
You can have MPEGs, which are files that are either audio, video, or both. MPEG refers to a standards committee (like you needed to know that!), and this committee keeps issuing new formats. MPEG-2 is the standard used for most *.mpg files. A variant is MPEG-4, which was modified to create the Windows Media Video (*.wmv) format; Apple QuickTime (*.mov) is a third. These file types can be created by different softwares, and they can be played back by freely distributed playback software. Like Adobe Acrobat, the player is usually free, and the computer's operating system must spawn the player for the file to be seen. The file formats are mutually incompatible, although some players can play more than one format.

In addition to these, there is Macromedia Flash/FlashPlayer. This is like the others, except that Flash allows one to create a digital image by manipulating objects in the Flash software; it's like MS PowerPoint, with the ability to animate the presentation and upload it to the web. Flash files (*.swf) are typically viewed as an animated graphic within the web page; it's not usually necessary to spawn a new window for playback. As a result, one can combine animated and non-animated elements in a single page. Also, Flash is very easy to use, in my opinion.
COOL STUFF I NOTICED LATER: Here's a blog post about new features available in the latest release of Flash (hat tip to Wikipedia's Flash entry).
(To be continued)


13 June 2005

On a Foray into HTML

Some terms of art for the web:

Some of you are going to hear some technical language used here that is quite intimidating. A case in point is the jargon associated with web pages, the internet, and so on. The fact that many of these terms have multiple meanings doesn't make it easier, but let us hope this does.

First, many people surfing the internet may be a little confused by the terms "internet" and "web." These are almost, but not quite, synonyms. The internet is a network of networks, connected (at least initially) through the telephone lines, using signals much like voice transmission. Computers exchange data over it using a universal standard called TCP/IP, which grew out of network research begun in 1969 through the Advanced Research Projects Agency (ARPA), a branch of the Department of Defense. Much later, a format called HTML was developed that allowed web browsers to take data sent over the network and display it as graphical pages. It's easy to see why browsers and HTML had to be invented concurrently: a browser had to be able to translate incoming data into an image that could be displayed, and there had to be a standard format that any browser, on any computer, would render the same way.

The internet was initially useful to computer terminals connected to mainframes, running arcane services like FTP, Usenet, and Gopher. I recall having a lot of friends who were familiar with these services and talked about them a lot, and finding it inconceivable that these things would ever amount to anything but costly nerd toys. In 1993, however, Mosaic, from the National Center for Supercomputing Applications (NCSA), emerged as the first popular graphical browser, thereby creating—in a stroke—the world of interconnected hypertext we know as the "Web."

(To be continued.)


07 June 2005

Intel inside Macs? Oh, the humanity!

First, United Airlines has teamed up with Verizon to offer Wi-Fi to passengers (hat tip to Digital Lifestyles). What this means is that UAL has gotten FAA approval to install Wi-Fi on Boeing 757-200's, which remain a rather small part of the UAL fleet. When can you start updating your blog over the Pacific Ocean? According to Forbes, UAL still needs FCC permission; a firm date is expected by August, and the launch is presently scheduled for sometime next year.

Also big news, to me: the heartbreak that Apple Macintoshes are switching over to Intel (i.e., they're dumping the IBM PowerPC chips). This is really huge news, and very deeply disappointing. For years PowerPC's not only powered Apple computers and blazed some very bold trails in semiconductors; they also powered IBM workstations, minicomputers, and parallel supercomputers. I was really excited when the PowerPC hit the streets in the mid-1990's because I was expecting it to serve as the bridge for topflight workstations running Windows, MacOS, and supercomputing applications. None of this panned out, and after Jobs returned to Apple as chief executive (and scuppered the NextStations), he pulled the plug on licensing the MacOS. Mac became the costly, "cool" computing platform, and your correspondent, the destitute dorkwad, disconsolate, bought a PC.

The PowerPC was developed jointly by then-CPU titan Motorola, IBM (from its in-house POWER chips, which used RISC architecture but had ALU's that could calculate square roots in a single clock tick), and Apple. IBM does continue to support the PowerPC in its systems, and another big user is the auto industry—to control emissions systems—but in the future, the big consumer of the PowerPC will be the video game industry, with its voracious demand for graphics. The PowerPC was the first chip to feature an integral digital signal processor.


06 June 2005

Fire Wire

What is FireWire? According to the Apple Website:

FireWire is a cross-platform implementation of the high-speed serial data bus ...that can move large amounts of data between computers and peripheral devices. It features simplified cabling, hot swapping, and transfer speeds of up to 800 megabits per second (on machines that support 1394b).

Major manufacturers of multimedia devices have been adopting the FireWire technology, and for good reason. FireWire speeds up the movement of multimedia data and large files and enables easy connection of digital consumer products — including digital camcorders, digital video tapes, digital video disks, set-top boxes, and music systems — directly to a personal computer.

Standards have to be defined; there's a professional association of electrical and electronics engineers (the IEEE) that develops these things, and it issued a series of standards (IEEE 1394a & b) that specify how FireWire works. If you're interested in a detailed technical introduction to FireWire, click the link above and scroll down to the end of the article for more links.

For those of you like me who just need to know the significance: there are two ways of transmitting data among microchips, serial and parallel. In serial, the bits flow along a single corridor, one after another. In parallel, there is one lane for each bit in the "word," so a 64-bit computer (like one built around the AMD Athlon 64) communicates over a 64-lane freeway. Wireless communication, of course, doesn't really allow for parallel transmission of data; it requires serial communication.
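As a back-of-the-envelope check on the quoted figure (and ignoring protocol overhead), here is the arithmetic for moving a large file over a FireWire 800 serial bus:

    # How long does 1 GB take at 800 Mbit/s? (Protocol overhead ignored.)
    file_size_bits = 8 * 10**9     # 1 gigabyte = 8 gigabits
    bus_rate = 800 * 10**6         # 800 Mbit/s (IEEE 1394b)

    print(file_size_bits / bus_rate, "seconds")  # 10.0 seconds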

This was a big deal for Apple, whose Macintosh line uses serial ports for most peripherals. Apple was the main developer of FireWire but, unfortunately, stumbled somewhat over licensing the technology (Wikipedia). Since the 1990's, and the demise of SCSI, FireWire has made a comeback. This may have something to do with Apple's continued involvement in (read: control over) the development of its peripherals, which has brought it more and more closely into contact with Sony and the consumer electronics industry.

If FireWire becomes the most popular format for serial buses, I imagine those devices not using it would have to convert rather quickly, since the market is less tolerant of oddball formats than it used to be. This would be a small but important victory for Apple, Sony, and the consumer electronics mode of computer use.


05 June 2005

What is Dynamic Programming?

Dynamic programming has nothing to do with computer programming; the name was coined in the 1950's by the American mathematician Richard Bellman, back when "programming" did not refer to the business of writing computer code. Basically, DP was developed as a tool for finding the best path for an activity. The definition of "best" depends on context. Most of what I know about the subject comes from Mathematical Optimization and Economic Theory by Michael D. Intriligator (1971, Prentice-Hall).

Some of you will want to brush past me; here are three online resources on DP:
  1. "Dynamic Programming," David B. Wagner (courtesy Wikipedia)
  2. "A Tutorial on Dynamic Programming," Michael A. Trick (courtesy Wikipedia)
  3. Lecture Notes on Optimization, Pravin Varaiya, Chapter 9 (1971; 1998; PDF; courtesy Alex Stef).
Prof. Varaiya's link is a math book, so be advised; it's in PDF and very well-done. The part about DP is on PDF-129 and following.

In the meantime, I'll be discussing a few concepts as I skim them in this textbook on my lap.

First, a note on the word "programming": this is a mathematical term initially applied to the problem of choosing values of certain variables so as to maximize or minimize a given function, subject to constraints. For example, suppose you can enclose a yard with a fence of finite length. The area is a*b, but the fence cannot be longer than 2a+2b=C. This is a pretty basic problem in calculus; the answer is a square, a = b = C/4. You can use much more complex versions of this problem to find the optimal shape of a part used in a machine, or the optimal path of a rocket being launched into space.
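The fence problem is ordinary calculus; dynamic programming earns its keep when the decision unfolds in stages. Here is a minimal sketch of Bellman's "best path" idea in Python, with a made-up network: the cheapest route from any node is its cheapest first step plus the cheapest route from wherever that step lands, so answers to small subproblems can be cached and reused.

    # Cheapest path through a small staged network (the network is
    # invented for illustration). best(node) is defined recursively in
    # terms of best(neighbor); lru_cache stores each subproblem's
    # answer so it is computed only once.
    from functools import lru_cache

    edges = {
        "start": [("a", 2), ("b", 5)],
        "a": [("b", 1), ("end", 7)],
        "b": [("end", 3)],
        "end": [],
    }

    @lru_cache(maxsize=None)
    def best(node):
        """Cheapest total cost from `node` to `end`."""
        if node == "end":
            return 0
        return min(cost + best(nxt) for nxt, cost in edges[node])

    print(best("start"))  # 6: start -> a -> b -> end (2 + 1 + 3)

Without the cache, the recursion would re-solve the same subproblems over and over; with it, each node is solved exactly once. That is the whole trick.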

(To be continued)


04 June 2005

What is DMB and why is it trying to crawl into my ear? (Part 2)

Part One

The Republic of Korea and, to a somewhat lesser degree, Japan have adopted widespread usage of DMB in order to allow massive concentration of wireless activity. Cellular telephones in the RoK are already incorporating 3G features that require broadband connections; the lines between personal computers and PDAs have faded, leading to a youth culture in the country that is highly tech-savvy. Radio broadcasting has become audio-internet in NE Asia; television is on its way to losing the passive "idiot box" orientation of its first 50 years and becoming more like the internet.

Some of this is, of course, overhyped, and it's easy to get swept up in the anticipation of an arcadian future for these countries, a sort of NE Asian "School of Athens." Koreans are not going to suddenly abandon game shows for historical epics merely because producers have pay-per-view. However, the technology is likely to radically change the way people consume entertainment. For one thing, the choice will be far greater, and the real demand for "content" is going to come not from advertisers, as before, but from viewers. My guess is that consumers will pay more attention to programs, since they will pay for them.

The old arrangement, under which a TV entertainment consumer videotaped programs while away and then watched them at leisure, is going to be replaced by a server in a warehouse responding to signals from consumers. The consumer will have the PCS transmit the movie or episode of a soap opera