29 June 2005

What is CSS?

Many of you who have struggled with setting up webpages in (e.g.) LiveJournal may have noticed that much of the web design and configuration is done through cascading style sheets (CSS). A style sheet is a part of a web site that defines the attributes of certain tags, so that a large and complex web site can be encoded with custom tags. The site's look and feel is determined by CSS rules (little bits of descriptive code), while the document's structure is written in another markup language, like HTML. A stylesheet is like the legend found at the front of an atlas: the maps in the rest of the atlas use symbols to indicate features, and the legend on page xvii tells you what those symbols signify.

CSS actually works with different types of markup languages, such as XHTML, HTML, XML, and scalable vector graphics (SVG).

The virtue of stylesheets is that one can use them to set up sites like Blogger (where you are now!), where users can choose a format without knowing any HTML at all. While I have used some HTML to edit the template of Reshaping Narrow Law and Art, I did not have to. In fact, as the CSS Zen Garden site shows, you can actually have the visiting browser choose the look and feel of the site. By clicking on the "design" options on the right, you are simply directing your browser to another stylesheet, while the underlying web page remains the same.
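
To make this concrete, here is a minimal sketch of the idea (the file names and class names below are mine, invented just for illustration). The HTML only says what each element is; the stylesheet says how it should look, and swapping in a different stylesheet changes the appearance without touching the page itself:

    <!-- page.html: structure only -->
    <html>
      <head>
        <link rel="stylesheet" type="text/css" href="legend.css">
      </head>
      <body>
        <h1 class="title">Reshaping Narrow Law and Art</h1>
        <p class="entry">A stylesheet is the atlas legend for this page.</p>
      </body>
    </html>

    /* legend.css: look and feel only */
    .title { font-family: Georgia, serif; color: #336699; }
    .entry { font-size: 90%; line-height: 1.5; }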

________________________________________________
UPDATE (12 Oct 2007): More on CSS here
________________________________________________
NOTES
tags: in markup languages like HTML, XML, TeX, and others, the programmer uses tags enclosed by some symbol, like "<" and ">". The tag is a command to the interpreter (such as your web browser) to read the subsequent text in a particular way. Those of you familiar with old DOS-based word processing applications like WordPerfect 5.1 or WordStar will recall that it was possible to press a key to reveal the formatting codes. I recall WP tags were in square brackets, thus:
           [BOLD]this is bold[bold] this is not bold.
Years later I learned that an essentially identical system is used to generate the WYSIWYG displays of modern GUI word processing applications.
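
The HTML equivalent, for comparison, uses a slash to mark the closing tag:

           <b>this is bold</b> this is not bold.
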
________________________________________________
SOURCES & ADDITIONAL READING: Andrew Fernandez, CSS tutorial;


What is XML?

(Part 2)

For those of you wondering what XML is, but too embarrassed to ask, XML stands for Extensible Markup Language. It is a meta-language, which means it is a formula for defining languages that use tags to mark up the document elements enclosed in those tags. A really common markup language is HTML; XML entered common usage several years after HTML, in order to correct several problems with HTML's "looseness," or tolerance for sloppy usage. As a consequence, HTML is not an implementation of XML; XHTML, or Extensible HTML, is.

What this specifically means is that XHTML complies with the guidelines for defining an XML-compatible language. As such, XHTML is known as an XML application, and is one of literally hundreds of such applications. Others include MathML, used for exchanging mathematical formulae online; CML, the Chemical Markup Language; X3D, the successor to the Virtual Reality Modeling Language (VRML); RSS (for syndicated feeds of blogs or news sites); SVG (for scalable vector graphics); and something known as XFDL (Extensible Forms Description Language), which is used for exchanging form data among e-commerce sites.

XML was derived as a dialect (a "subset") of SGML that could be delivered over the Internet. Its history runs parallel to HTML's; HTML 4 was later reformulated as XHTML 1.0 so that it would conform to XML 1.0. However, the two are not substitutes. While HTML tells a browser how to display data, XML merely describes the data. XML was created to structure, store, and send information.

W3 Schools XML Tutorial: The tags used to mark up HTML documents and the structure of HTML documents are predefined. The author of HTML documents can only use tags that are defined in the HTML standard... XML allows the author to define his own tags and his own document structure. ...Tags are "invented" by the author of the XML document.

W3 Schools XML Tutorial: When HTML is used to display data, the data is stored inside your HTML. With XML, data can be stored in separate XML files. This way you can concentrate on using HTML for data layout and display, and be sure that changes in the underlying data will not require any changes to your HTML.

XML data can also be stored inside HTML pages as "Data Islands". You can still concentrate on using HTML only for formatting and displaying the data.
XML is an official standard maintained by the World Wide Web Consortium (W3C), an international consortium founded by Tim Berners-Lee.

Since HTML is a language with a defined vocabulary, and XML has no defined vocabulary (that's the job of the application), all web pages created with XML applications absolutely positively must have a stylesheet that tells the browser how the author's invented tags should actually be rendered. There are currently two rival stylesheet languages, CSS and XSL. The latest versions of both are compatible with XML; XSL is specifically designed for use with XML, while CSS2 also works with HTML (XSL can generate, but not read, HTML documents).
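
Here is a tiny sketch of what that looks like in practice. The tag names below are ones I made up, which is exactly the point: in XML the author invents the vocabulary, and a stylesheet (CSS2 in this sketch, though XSL would also work) then tells the browser how the invented tags should be displayed:

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/css" href="recipe.css"?>
    <!-- recipe.xml: tags invented by the author -->
    <recipe>
      <title>Cold-Brewed Coffee</title>
      <ingredient amount="100" unit="g">coarsely ground coffee</ingredient>
      <ingredient amount="1" unit="l">cold water</ingredient>
      <step>Steep for twelve hours, then filter.</step>
    </recipe>

    /* recipe.css: gives the invented tags a presentation */
    title { display: block; font-weight: bold; font-size: 120%; }
    ingredient, step { display: block; }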


Adobe & MacroMedia

For heavy-duty webmasters, two of the most important names in software are Adobe and Macromedia. Adobe's main offering for the web is the Acrobat PDF reader, a freely downloadable program that displays files created in the PDF format the same way everywhere: as a virtual page (HTML, in contrast, not only has lines that break differently depending on the pixel size of the browser window, but appears differently depending on the flavor and configuration of the web browser). In addition to PDF writers and readers, Adobe supplies FrameMaker, an XML authoring tool, and GoLive, a web-page (HTML/CSS) authoring tool.

While Adobe offers several graphics applications, Macromedia is a little more focused on web animation and web authoring. Many people may be familiar with Dreamweaver, a fairly large package for creating web pages and populating them with JavaScript, and so on.

In mid-April Adobe announced that it was acquiring Macromedia, as part of a strategy to compete more effectively against Microsoft. Christopher MacKay pointed out some obvious (and still unresolved) questions arising from the merger:

What will be most interesting—as it always is in such cases—will be what survives and what doesn't.
  • Dreamweaver vs. GoLive (DW, I think)
  • Flash vs. SVG (expect to see SVG support tossed on the heap next to LiveMotion)
  • ImageReady vs. Fireworks (hard to say, I haven't used Fireworks)
  • Freehand vs. Illustrator (those who like Freehand like it a lot, and it's survived being moved from Aldus to Altsys to Macromedia. I suspect they'll integrate the Flash-friendly elements into Illustrator and retire it.)
  • FlashPaper vs. PDF (neither will perish)
  • InDesign vs. Quark (oh wait, that's a different acquisition...)

He also points out that Macromedia has sold Fontographer to FontLab, eliminating intra-firm competition between Fontographer and Adobe's PostScript font tools.

At the heart of this is Adobe's impending battle with Microsoft (c|net; hat tip to Ryan). MS has launched Acrylic and Metro; the latter is intended to compete directly with Adobe's PDF standard. Acrylic, incidentally, is based on the software of HK-based Creature House, a company that Engulf & Devour MS acquired. In typical MS fashion, the new software loses functionality (Acrylic will not run on the Mac, while Creature House's Expression did).


22 June 2005

EU Research versus the US variety

This is a post about developing trends, not about the two forming rival blocs. From a comment at Europhobia (Jeff) comes this article in Red Herring (link dead) on the comparative prospects for research in the US and the EU. My own view is that the USA is suffering from market fundamentalism, which means it cannot implement a sound industrial policy (and retain manufacturing jobs), and from sectarian fundamentalism ("Christian" identity politics), which makes it impossible to cultivate future generations of scientists from the ranks of younger children. Here's a post at Chris Mooney's site that I agree with (it's very short as well); here's "The Flight From America" (via Rue a Nation). However, I do like to read alternative opinions, so let's have a look at "Can Europe Survive?"
Red Herring: Mr. Martikainen, a Finn, started developing a router—hardware that directs streams of data from one computer to another—back in 1982 at VTT, a research institute in Espoo, Finland. The Finnish companies financing the research, including Nokia, didn’t see the potential, so the project was dropped in 1986, shortly before an American startup called Cisco commercialized similar technology. Cisco went on to dominate basic corporate networking gear, with annual sales of more than $23 billion. Mr. Martikainen today works as a professor and researcher; his prototype gathers dust in a university display.
They have three such examples, and then they claim this illustrates a trend. However, there are literally scores of famous American concepts, like fuzzy logic, which could not flourish on US soil. There is no universally correct model for implementing technology.
Europe has always produced great science and technology. It just hasn’t been very good at commercializing it. Disruptive technologies like GSM—now the most widely adopted mobile technology in the world—and Linux, open-source software that is arguably the biggest threat that Microsoft has had to face to date, were invented in Europe. So was the web.

By the way, Linux is actually a kernel (by far the most widely used) for the GNU OS. Linux (the kernel) was developed by a large group of volunteers, of whom Linus Torvalds was the principal one. It's really hard to ascribe a national identity to a program developed by literally thousands of coders working voluntarily around the world, but the GNU project was generally associated with universities in the USA, and Linus Torvalds is Finnish. GSM is actually an implementation of TDMA, which is a generic concept. I'm not trying to deflate EU research here, quite the opposite: I'm just saying that technology doesn't usually have a clear nationality unless you're talking about something so hyper-specific that it has only a few applications.

Many American entrepreneurs have gotten rich from building businesses around the Internet. Tim Berners-Lee, the British scientist who worked at the European Particle Physics Laboratory in Geneva (CERN) when he invented the World Wide Web, did not. At least, not until last June, when he was awarded a €1.2-million ($1.5-million) technology prize.

The reason CERN’s web concept did not become an entrepreneurial triumph is that there was no conceivable business model by which such a format could have made money. HTML, for example, is a fundamental element of the Web; it's the component that Tim Berners-Lee developed. Standards, by their nature, require a complex business plan to make any money at all: give away the standard so it's propagated; give away a viewer, like Acrobat Reader or Flash Player, so people can see material that complies with the standard; and sell something that the new medium creates a demand for, like Acrobat Writer, Flash, or CGI applications. That Mr. Berners-Lee couldn't think of a way to make money off of the HTML standard should come as no surprise. Likewise, it is no surprise that his browser was not a commercial success; neither was Mosaic.
Europe also fell behind in computers in the 1970s, software in the 1980s, and broadband in the 1990s.
These are not industries that it is prudent to develop, from the point of view of industrial policy (except for broadband). Computers are a capital-intensive industry that offers extremely high levels of risk and concentrated returns. In the USA, there were some early successes that have had little favorable impact on US industry per se: the technology itself was adopted irrespective of where it originated, and the companies that made fortunes, like Apple, IBM, Compaq, and so on, had a very transitory phase of wealth creation. Besides, EU member states have never flourished in those industries.

Software is even worse. In the case of MS, about 99% of its immense return on investment has consisted of rents torn from other enterprises, often inflicting an opportunity cost far greater than the revenues captured by MS. As for the other firms (Oracle, Adobe, and the lot), much of the coding was eventually outsourced to other countries. The total number of individuals involved was tiny, and the period in which the USA reaped any sort of competitive advantage from Silicon Valley was actually quite brief. The immense fortunes made there were fetishized by business magazines, but were in reality a disaster for American enterprise generally: venture capital, usually managed by prudent, hard-nosed people, was sucked into a frenzy much like the South Sea Bubble, then dissipated in stupendous waste. The new technology that emerged consists of standard items that can be manufactured anywhere. The profitable business is in proprietary devices like microchips, which are manufactured overseas.
“In the end that did not happen, in part because European research is far too dirigiste, planned top down, and in such attempts at detail, serendipity can play no role,” says Mr. Negroponte. “If you visit the MIT Media Lab, there is considerable chaos and it is highly unstructured. Its interdisciplinary nature—itself hard to do in Europe—guarantees a high degree of innovation.” Media Lab Europe shut its doors earlier this year.
If EU research is dirigiste, that's a problem having to do with the relationship between states and institutions; the MIT Media Lab reflects the dynamics inside a university. A fairer comparison would be between the climates inside the many American research facilities of different provenance and those of their many counterparts in the EU. Again, comparing a lab at MIT where there is chaos to a totally different entity in the EU that closed is not merely a cheap shot; it's comparing apples and oranges. The MIT Media Lab is basically an outgrowth of the MIT university culture; Media Lab Europe was a venture between MIT and the Irish government to transplant that same culture to Dublin. With all respect to Prof. Negroponte, he was the founder of the MIT Media Lab and a collaborator in the scheme to recreate it in Dublin; it is reasonable to expect him to try to blame his failure on those stodgy Europeans.

The article mentions the European sense of entitlement, although it is the USA where gigantic, accountability-proof payments are made to top management; it mentions the existence of some people who are suspicious of capitalism generally (as opposed to the USA, where ideological conformity is far more strictly enforced); and it closes with a disjointed, complacent rant against national characters.

The last part of the article is actually not so bad, with acknowledgment that the EU is definitely making changes to facilitate startups and make labor markets flexible. However, the implication is that there is only one way to grow technology, and that way involves startups with socialization of the costs, the model that prevails in the USA. In fact, EU firms innovate at least as much as their American counterparts do, but they use different approaches: usually, research is done inside existing firms or their Stiftung (foundation) structures, not in a startup. Had Red Herring focused on the Republic of Korea, Taiwan, or Japan, its observations would have been more apt, but more obviously irrelevant. Startups in Japan aren't impossible, but they're damn close to it. Yet Japanese industry is world-beating. Korea's staggering lead in consumer electronics makes the USA look like the Flintstones, but it was achieved through state-chaebol collaboration.

Again, there is no one model. EU member states are optimized for collusion among the state, large enterprises, and universities; the USA, for startups, privatization of profits, and socialization of costs; NE Asia, for a coordinated division of labor between conglomerates and the finance ministry.


Microsoft & the EU

The European Commission (of the EU; hereafter, the EC) is nowadays a very important body in antitrust enforcement because of the rapidly increasing size of the market it regulates. The Competition Commissioner, the Italian Mario Monti, attracted international headlines by stopping the merger of Honeywell & GE; Monti also began an antitrust action against Microsoft, which has taken a while to attract the sort of attention drawn by the Clinton-era antitrust action against the Redmond, WA-based company.

The American Department of Justice action against MS was directed firstly against its pressure on retailers to bundle computers with MS Office (or face sanctions from MS), and secondly against its bundling of the Internet Explorer web browser with the Windows OS. The EC action focused not on the web browser, but on Windows Media Player (WMP). BusinessWeek points out that Windows XP N ("N" means there's no WMP) has been a commercial bust, and of course the whole concept of penalizing MS in this particular way is just incredibly silly. For one thing, WMP is available for free. The anti-bundling remedies used by antitrust courts have usually culminated in forcing MS to de-bundle one of its features, i.e., make Windows available minus the feature (say, Internet Explorer, WMP, and so on). The customers always want the version with everything, so why would they buy the strippie? And it's an awkward precedent. What about PDAs equipped with GPS? Apple iPods with Bluetooth telephony? This could easily become an obstacle to incorporating technological advances as they appear.

The other front, which concerns licensing the interfaces and protocols of Windows server software, is a much more logical tactic. Competition in the computer industry has made far more progress through licensing agreements than through efforts to control what computer firms may or may not develop.


21 June 2005

Telecommuting the Wave of the Future?

Is telecommuting the way we'll work in the future? From Inescapable Data:
When we interviewed companies for our book and researched the changes in corporate worklife in general, we became aware of the huge shift to non-office office workers. Many large companies such as IBM and Sun boast that currently 33% of their workforce has no corporate ‘office space’ any longer, possibly heading toward 50% or higher.

Well, sure, Sun and IBM have a lot of telecommuters. But what about others? Rep. Frank Wolf (R-VA) is demanding that federal agencies certify that they are increasing telecommuting opportunities for their work forces, on pain of losing funding (Kansas City Star). The article says that 6% of federal employees already telecommute.
Wolf began championing a robust telecommuting program in the government about five years ago as a way to help the Washington area cut traffic congestion and pollution. But agencies have moved slowly on the issue despite encouragement from the Office of Personnel Management and General Services Administration. Many federal managers are not comfortable with the concept or are concerned that employees who work at home may be less productive.


Last year, Wolf directed the departments of Justice, State and Commerce; the Small Business Administration; and the Securities and Exchange Commission to file reports showing that eligible workers are permitted to telecommute.


Wolf has asked the Government Accountability Office to review the telecommuting reports.


Because the House expanded the jurisdiction of his subcommittee, Wolf also would require the National Science Foundation and the National Aeronautics and Space Administration to certify that they provide ample telecommuting opportunities for their work forces. Wolf asked GAO to review their plans as well.


Rep. Wolf is probably looking for ways to trim federal spending on office space and payrolls (so that, for example, the pool of federal recruits need not live within daily commuting distance of a federal office). I understand some states have seen fit to modify their tax laws accordingly.
Most notable is a recent case in New York. This spring the New York Court of Appeals ruled against a Tennessee resident, Thomas Huckaby, who telecommuted for a New York company. Huckaby filed New York nonresident income tax returns, allocating his income between Tennessee and New York, based on the number of days he spent working in each state. Because he spent one-quarter of his time in New York, he paid taxes to New York on one-quarter of his income.


However, under the convenience of the employer rule, if a nonresident chooses to telecommute to a New York employer some or most of the time, the nonresident must allocate the income earned at home to New York. The court said New York could tax the telecommuter on one hundred percent of his income.


This is considered the most aggressive tax case to date that targets telecommuters, and it has drawn harsh criticism from the International Telework Association & Council (ITAC), which estimates 44 million telecommuters in the U.S. and sees the ruling as a discouragement for employers to offer telework.

If telecommuting becomes as widespread in other services that can be farmed out as it is at Sun Microsystems and IBM (improbable, of course), then we could in fact see a telecommuting workforce that really does number 44 million. (And if this diet works, my weight might really approach what it says on my passport!)

What is "EDGE"?

Much of the buzz about PDAs nowadays is that they come equipped with EDGE. This, of course, cannot be Googled: "edge" is such a common word that you're likely to wind up with a lot of adverts saying this or that device has "an edge," or worse, "Her humor has a real edge...here's what she says about couples who insist on necking in public (or other PDAs)."

I've been using the term PDA (personal digital assistant) to refer to very small electronic devices designed chiefly for storing and retrieving text-based information. The most common are the RIM Blackberry, the PalmOne Tungsten and (formerly Handspring, now PalmOne) Treo, and the PocketPC (all links are to image searches). Initially PDAs amounted to beefed-up calculators (the Sharp Wizard), but now your PDA is likely to come with cellular functionality. Likewise, "3G cell phone" refers to a cell phone with a lot of the features commonly encountered on PDAs. Hence, it's become routine to hear "GSM," a mobile phone standard, used in connection with a PDA (which, in the early 1990s, would have implied that the speaker was Dilbert's boss).

GSM is the most common mobile phone standard; 70% of PCS subscribers use it, and it's mandatory in the EU. However, it has a data ceiling of about 14.4 kilobits per second (kbps), whereas its designated successor, W-CDMA/UMTS, has a theoretical maximum of roughly 2 million bits per second. UMTS licenses have already been sold, and companies like Hutchison-Whampoa and DoCoMo are already operating 3G UMTS networks in the EU.

In order to bridge the gap, EDGE was created. It stands for Enhanced Data rates for GSM Evolution, and it is a workaround for the technical limitations of GSM. Here's the Wikipedia definition. Basically, it involves smoothing the square 0-1 pulses of a digital signal into Gaussian-shaped waveforms, and then using multiple phase shifts of the carrier so that each symbol carries several bits, letting data be transmitted at multiples of the notional GSM limit.
In addition to GMSK (Gaussian minimum-shift keying) EDGE uses 8PSK (8 Phase Shift Keying) for its upper five of the nine modulation and coding schemes. EDGE is producing a 3bit word for every change in carrier phase. This effectively triples the gross data rate offered by GSM. EDGE, like GPRS, uses a rate adaptation algorithm that adapts the modulation and coding scheme (MCS) used to the quality of the radio channel, and thus the bit rate and robustness of data transmission. It introduces a new technology not found in GPRS, Incremental Redundancy, which, instead of retransmitting disturbed packets, sends more redundancy information to be combined in the receiver. This increases the probability of correct decoding.

It can carry data speeds up to 384 kbit/s in packet mode and will therefore meet the International Telecommunication Union's requirement for a 3G network.
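
A rough back-of-envelope, based on the excerpt above, shows where the tripling comes from (the last line is approximate):

    GMSK (plain GSM/GPRS): 1 bit per symbol
    8PSK (EDGE):           8 possible phase states = 2^3, so 3 bits per symbol
    3 bits / 1 bit = 3x the gross data rate on the same radio channel;
    spread across several timeslots, that is how EDGE reaches the 384 kbit/s class cited above.
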
EDGE improves on the less capable GPRS (as the entry says) partly by adding redundancy information to the signal instead of simply retransmitting disturbed packets. So now you know.


19 June 2005

I've invented something and it's an odd fit-2

(part one)
To put it bluntly, I think it's a straightforward matter that the odds favor doing nothing unless a value-added reseller (VAR; this includes PCS companies like Verizon, Singtel, or Cingular) demands the product. I think the ideal strategy is to find other inventors in the same field and present these inventions together as part of a single product.


In this case, I think you may need to think about the selling of the invention as part of the invention itself.

This means you have to think about how the invention would be implemented. In the case of Singtel:

Singtel is a service provider. Their job is to provide services to users of cellphones. They buy phones from another company, custom-made, perhaps add microcode, and resell them to users bundled with the PCS. This means they are likely to represent the demand for a new user interface. What I am saying is, your customer is most likely going to be a VAR.

Now, where are these VARs going to be located? If in the USA, the language is English, and I would argue that your ideal user is not Kristie Midriff, but the industrial firms that provide handhelds, such as Psion Teklogix. The reason is that these are firms that already provide custom devices with specialized interfaces, and they may well need something with a small, compact interface that they can adapt. This requires that you learn about industrial handhelds.

You then take this knowledge and use it to refine your product.

Also, you need to do a lot of research. There are networks like Skype, and I've blogged about them. You need to use them to get in touch with other inventors who are in the same field and have complementary technologies, because that is how you are going to make your patent attractive to others: by finding complementary patents, contacting the developers, and forming alliances. This will require diplomacy and knowledge. Do you feel that you are up to this? This is what inventing is all about.

17 June 2005

I've invented something and it's an odd fit. What do I do now?

A friend of mine developed a design to improve the user interface of PDAs. It's a pretty cool concept, but it's a very peculiar fit. For an OEM to adopt a new user interface is a very big step. For over a decade now, the manufacturers of PDAs have spent literally billions on trying to make text entry work. The cell phone is gradually merging with the PDA and evolving along the same lines, yet the interface used on cell phones for SMS is not very well suited to it. At the same time, the QWERTY keypads used on larger PDAs like the RIM Blackberry are not very well suited to SMS.* People put up with them, but it is the demand for SMS (and MMS) that is driving the growth both in PDAs and in 3G cell phones, not their disappointing user interfaces.

So you'd think a new user interface would receive an enthusiastic hearing.

Well, when my friend described his idea to me, I was quite skeptical. I strongly suspect that there exists a mathematically ideal arrangement of keys—both the number of keys, and the arrangement of those keys. I think a reasonably efficient team of mathematicians and programmers could find that arrangement in about a week, and I think that arrangement is on file somewhere at Samsung, LG, Ericsson, and Nokia. I also believe that slightly less reliable information is on file with these firms on the cost of these phones, and less reliable information still on the benefits.

What that means is that our typical large 3G/PDA OEM** knows with 100% certainty what the best design is; knows with 75% certainty what the costs of implementation will be; and knows with 25% certainty what the commercial benefits of implementation will be. Each firm has a patent on its version, or can get a patent whenever it needs to, because each firm used slightly different constraints in finding its optimal design. The managers of the firms understand that their estimates are flawed; they know that they err (a lot) on the side of conservatism when estimating the benefits, and on the side of excess when estimating the costs. But statistically, those "errors" are the way to bet, and so they do.
(To be continued)
* SMS: short message service; MMS: multimedia messaging service
** OEM: original equipment manufacturer; VAR: value-added reseller


15 June 2005

RIM Lawsuit (Part 2)

Part 1 was background to my other email on the NTP vs. RIM lawsuit that's going on now. RIM stands for Research In Motion, a company based in Waterloo, Ontario, Canada; NTP is a patent holding company (i.e., a legal entity created to own and defend patents) based in Virginia.

One thing I wanted to mention, because it seemed to be on your mind, was the lawsuit between RIM (developer of the BlackBerry) and NTP (the patent infringement plaintiff). This suit pertained to the transmission of data between individual BlackBerry units and transmission relay units. The lawsuit applied to all RIM devices, and did not affect the keyboard design. Thomas Campana was both the inventor of this technology and the founder of NTP, about which almost nothing is known (all I know traces back to publicized court documents and Gartner Dataquest's Todd Kort). Campana appears to have done his work in the early-to-mid 1980s while at AT&T.

According to Kort, who did the research on the NTP vs. RIM case, Campana's behavior was awfully fishy. He came up out of nowhere in 2000 with this lawsuit, but he had the documentation and stomped RIM legally. I think most technology experts consider his case flimsy, but it turns on the geographical location of the patent infringement.

Campana has since died, but NTP naturally continues the fight.

Anyhow, RIM settled and its stock price soared. Unfortunately,
Forbes/New York Times: "As Patent Deal Unravels, Anxiety Rises at BlackBerry Maker"

For three and a half years, patent claims by NTP, which is based in Arlington, Virginia, have been a cloud over Research in Motion, based in Waterloo, Ontario. But the announcement last week that a $450 million agreement reached in March was unraveling came at a particularly delicate time.

In a court filing that followed, the privately held NTP indicated that if the settlement cannot be revived, it plans to invoke an injunction banning sales of BlackBerries and their e-mail service throughout the United States.

That injunction, which was put on hold during an appeal by RIM, has grown more powerful with time. After years of having the wireless e-mail market more or less to itself, RIM now faces competition from hardware makers like Palm Computing and software vendors including Seven Networks, Good Technology and Visto.

For now, a shutdown of Research in Motion in the United States, where the company gets about three-quarters of its revenue, is far from certain. But the renewed legal uncertainty is almost toxic for some members of the investment community.
I am not competent to say if this is a serious possibility or not. But RIM is one of the biggest names in PDAs.

RIM Lawsuit (Part 1)

I thought I blogged about this before, but it turns out it was all in a private email to a friend. Here is what I wrote in the email:

April 21, 2005

If you're still interested in RIM's legal battles—please notice computer companies are all about the legal battles. This article in Forbes below is kind of old (Nov '02) but keys you into the sort of fighting that goes on in the PDA business.

RIM sues Good Technology, Inc. over the latter's synchronization software (GoodLink), which it offers as an upgrade for RIM handhelds. Its main difference from the BlackBerry software is that it allows over-the-air synchronization of contact lists, appointments, and other information in addition to the staple of wireless e-mail.

RIM sues Handspring (now part of Palm) over its use of the BlackBerry keypad on the Treo; then settles for a big payout, causing Handspring stock to soar (just as RIM stock soared when it settled).


JRM

(Forbes article follows)

FORBES Mobile Computing, Royalties In Motion
Arik Hesseldahl, 11.07.02, 3:57 PM ET

NEW YORK - What's good for the Palm operating system is now good for the Blackberry.

Against a backdrop of lawsuits targeting competitors, Palm (nasdaq: PALMD) today agreed to license a patented keyboard design from Research In Motion (nasdaq: RIMM). Financial terms were not disclosed.

RIM's attorneys have been on an aggressive streak lately. First came a series of four lawsuits against Sunnyvale, Calif.-based startup Good Technology. Good sells software called Goodlink that allows it to sync up over-the-air with a PC running Microsoft Outlook. Goodlink runs on RIM's Blackberry devices.

Last month, a judge in Orange County, Calif., denied RIM's request for a temporary restraining order that would have prevented Good Technology from selling the software, but the four suits are pending. (The lawsuits, however, have had an unintended effect: They put Good Technology on the map, generating a lot of press that led to new customers. Good Technology says it has sold its software to about 400 companies.)

A lawsuit against Handspring (nasdaq: HAND) followed, this one specifically aimed at the keyboard on Handspring's Treo devices, which not only have e-mail capabilities, but double as mobile phones. The Handspring suit was settled in RIM's favor earlier this week for an undisclosed amount, which helped goose Handspring's share price back up above $1 for the first time since late September. The Treos, like Palm's Tungsten-W, do a fair job of resembling a RIM Blackberry, which may have gotten Palm to the negotiating table today.


Data Freeway

FTP stands for File Transfer Protocol. It's a standard for transferring files (of any format) between computers, regardless of operating system. It allows a webmaster to communicate with the server hosting her homepage.

FTP clients are programs that allow a webmaster to upload or edit the files of a web page. Unlike "normal" files like your term paper (that you wrote in MS Word), the files that are incorporated in your webpage reside on another computer—an FTP server. FTP servers are also called hosts. You are only going to need an FTP client if you have a web page. Even then, you may not need one: this site can be maintained without an FTP client. Most personal web publishers, like Nucleus CMS, Movable Type, bBlog, WordPress, b2evolution, boastMachine, Radio, and Drupal* have file uploading built in. Likewise, Macromedia Dreamweaver has an FTP client built in.

However, these are often inadequate. Movable Type only allows one to upload files; taking them down or editing them, or re-arranging the file system (like, for example, putting images in subdirectories) is impossible. I've never used the other publishing CGI applications, so I can't comment about them.

For those of you who—like me—always want free stuff, there is a free download of an excellent FTP client available:

So now you know. I've been using this one for several months and I think it's superb.
* List of web publishers and links via Wikipedia


They built a better Paint—and it's freeware

Dear Readers, I now have to make some remarks for fellow Windows users.

Windows comes bundled with a program called Paint. Click the Start button and select Programs, then Accessories. It should be there along with the DOS shell and WordPad. If you've used it a lot, as I have, you know it has some serious limitations. The worst limitation is the way it saves JPEGs. Basically, if you open a bitmap file and save it as a JPEG, it looks like—er, uh, it looks terrible. Suppose the bitmap is the Japanese flag. In BMP, this is a circle of solid red on a field of solid white. Save it as a JPEG, and there's a mist of tiny reddish ripples leaking into the white. Immaculate faces look like they suffer from severe acne or scarring.

MS Paint's GIFs are about as bad: that BMP of a Japanese flag now is a crisp circle of solid muddy grayish brown on a pure white field. At least we can't mistake it for the flag of Bangladesh!

Imagine my joy to discover there is a better way: Paint.Net.

Paint.Net is freeware and you can download it here. When you save JPEGs, you get to choose the level of quality (on a scale of 1-99%). If you pick 95%, the image quality is still quite satisfactory, and the file is about an eighth the size of a BMP. The links to graphics of flags are to GIFs, but the colors are true.


Screencapture of Paint.Net (click for larger image)

Another reason to download this program is that the tools for editing files are vastly more powerful. A lot of the effects in Photoshop, for example, are there in Paint.Net.

One feature I would love to see them adopt, however: rotating an image, or a selected part of an image, by an arbitrary number of degrees. A lot of the time I just want to tilt something 5 degrees.

UPDATE (10 March 2007): I added the illustration above. Also, please note that the current version of Paint.Net has a command, [Shift]-[Control]-[z] (or Layers | Rotate-Zoom), which does indeed allow one to rotate the image or any part thereof an arbitrary number of degrees. One can also tilt the plane of the image, as shown below.

Same image as above, "tilted" backward


Mac & Windows

I'm going to have to admit that I do everything in MS Windows. For years I dreamed of a Macintosh, but put aside those dreams when Steve Jobs ended licensing of the Mac OS and insisted on a monopoly of the proprietary operating system. Since then, the user base of Mac OS has dwindled as a share of the total, and Jobs's focus has been on the Mac as an expression of profound individuality. However, I have observed repeatedly that there is an awful lot of really good, creative blogging out there being done by Mac users. I've noticed that the Mac user base, far from being the group of technical novices I would have expected, is actually more technically competent and shrewd than the Windows user base.

I still have this big ugly pool of frustration with Steve, because I think he spent so much energy marketing Apple products as fashion accessories, but I've gotten older and increasingly I recognize that Apple products have retained a crucial technical edge over Wintel ones in many respects. The anarchic pile of patches and bugs that is Windows essentially uses up all of the skill and talent of its coders. The hideously dysfunctional business climate for Windows software developers consigns entire cohorts of coders to arbitrarily inflicted obsolescence. There's a great Japanese word for this, muda (waste), which IMO describes about 75% of the effort in Windows development. Here we are, twenty years after the development of the GUI, and we're taking up Linux as the alternative to Windows. The fact is that Windows is not a programming environment; it's a cross between the Cold War and the Bosnian Civil War. The Mac OS community is, by this analogy, a stable, productive society. So Steve handled the technical issues successfully.

As to the marketing: I have a really tin ear. My ability to anticipate the effect of a certain thing, like the presidential debates, on the public is so poor that I've had to give up on the exercise entirely. The sort of reasoning that wins arguments nowadays is not my reasoning.

That's where we're going with this: the market of ideas and products as a programming environment, a civil OS, as it were. I'm used to thinking of operating systems as a bunch of specialized code that you install on a computer so the damn thing can open Word files. But it turns out operating systems are a metaphor for society. The components of the program (institutions) work by collecting what we know about pieces of data (individuals) and associating that data in ways that serve the survival of the institution. The system files mostly evolved in different places, and they don't necessarily work together very well, but it's much too late, too difficult, and too controversial to revise them so they make any sort of sense. There's also the core of the OS, which gradually absorbs more functions of the system software (software drivers, social welfare systems, and child protective services), but there will always be things that the OS cannot do, that have to be done anyway (things that the state cannot do, that require spontaneous or traditional associations).

And I notice that the programming API (the standards that applications have to fit into in order to run in a programming environment) for our society is set not by political philosophers like John Locke or Thomas Aquinas, but by fashion. This is not a terribly original idea, but it's something I keep overlooking. Karl Marx, John Stuart Mill, W.E.B. DuBois, and Joan Robinson came up with really compelling analytical models to explain how societies interact with their institutions—yes, they did. But these models are terrible at predicting the future, and people who understand them become worse, not better, at reading the responses of their neighbors.

The Mac OS deserves to beat the Windows OS because it works like a properly engineered design. It's a more effective, waste-avoiding way of doing what programmers and computers are supposed to do than Windows ever will be. It spawns creativity; the anarchy of Windows, as we've seen (and seen, and seen, and seen again), spawns drudgery and frustration. This is not because Windows programmers are stupid; nor do programmers code for Windows because they're masochists. They do it to earn incomes and get their ideas onto people's desktops. Windows is the world in which we live; Mac OS is an enclave of sanity. It's like a tiny little corner of Bosnia where Serbs, Croats, and Bosniaks live in harmony. The rest of the country is chaos, and that's the world of computing: massive waste of potential because of chaos. People never actually chose chaos, but that's what we get. So we plan for it, and we assume it.

And that's why fashion sense—a system of hierarchy that explicitly repudiates any logic—is the law of the jungle. Jobs understands that. Why did I not see this before?


On a Foray into HTML-4

In part 3 I mentioned Java and JavaScript and explained how they are different. Now I need to introduce a difficult concept. I mentioned that Java was developed by SunSoft/JavaSoft to be readable by all computers, regardless of operating system or browser flavor. That works through something called the Java Virtual Machine (JVM). Java itself can be used to write any application you like; there are word processors written in Java, databases, and so on. A huge benefit is this: suppose you have a cool program of scientific or intellectual benefit, and you want everyone to be able to use it. You can write it in Java, and anyone can run it, even if they don't have the program installed on their computer. No one has to configure anything, or wait 20 minutes for the InstallShield to launch, or reboot the computer, or even open another window. It just works.

There is another kind of web-delivered application, which uses the common gateway interface (CGI). CGI programs are frequently written in PERL, and they are not designed to run in anything like a JVM. They run on the web server itself, in its cgi-bin directory; a visitor to your website never downloads or runs the program, but only sees its output. CGI applications fill the need for a program that runs from the cgi-bin itself and whose output is HTML/JavaScript. (NOTE: for you linguists reading this, OOC, C++, Java, and PERL are all derivatives ("hacks") of C; whereas C enabled the Internet, HTML, PERL, and Java enabled the Web.)

Code written in PERL is called a "script." The most common CGI applications are personal web publishers, like Movable Type and TypePad. Another example of a CGI program is the one implementing a wiki: you hand it the name of an entry, and it will retrieve the source of that entry's page (if one exists), transform it into HTML, and send the result back to the browser. Or you can tell it that you want to edit a page. All wiki operations are managed by this one program. CGI applications are also used to administer databases. Google is a CGI application.

The one big thing that a CGI application does is generate HTML. For example, suppose you need to register visitors so you can distribute secure data online. This means you need a template for a form (like a form in MS Access); a query, which retrieves the data stored in the back end of the database based on whatever the user put in the form; and reports, which may include an invoice, directions to St. James Cathedral, or an encyclopedia entry. The CGI application also requires templates, which are mostly HTML plus some PERL script that actually inserts the retrieved information.
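
As a sketch of the form half of that (the script name and field names here are hypothetical, just to show the shape of the thing): an HTML form hands the visitor's input to a CGI program sitting in the server's cgi-bin, and whatever HTML that program prints back is what the visitor sees next.

    <!-- register.html: the form template -->
    <form method="post" action="/cgi-bin/register.pl">
      <p>Name: <input type="text" name="visitor_name"></p>
      <p>Email: <input type="text" name="visitor_email"></p>
      <p><input type="submit" value="Register"></p>
    </form>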

In addition to the language used to create CGI applications (PERL), there is the application that people actually use. Frequently the application is customized; for example, Wikipedia is an encyclopedia (a database back end) plus the application that delivers that data (the database front end). Google is another. However, there are a few programs you can install on your web host that are analogous to MS Access. I already mentioned personal publishing (blog) software; not surprisingly, most bloggers are not technically inclined, and even those who are have no desire to code their own publishing software! So they download it from BigNoseBird or other sources.

Macromedia Dreamweaver is a program that allows one to create a web page with no HTML knowledge (or at least, very little). Dreamweaver also allows one to create very simple CGI applications with limited knowledge of either CGI or PERL.


14 June 2005

On a Foray into HTML-3

SunSoft Java[*] and Netscape JavaScript[*] are both programming languages that are commonly associated with the internet. The similar names are misleading, however: they refer to very different things. In this blog post and others, I'm going to refer to a program and its elements as "code." You could say programs are written with code. I will also use the term "compiler." This is a program that reads code written in a high-level language and translates it into low-level instructions so the computer can do what it's supposed to do.

Java was developed at about the same time as Mosaic, the first widely used graphical Web browser. Most computers supplied since the mid-1990s have a "Java virtual machine" (JVM), a runtime that executes compiled Java code. This allows Java code to be run by any browser anywhere, any time, regardless of the computer on which one is browsing the web. The VM is common to all browsers, regardless of flavor (this is not STRICTLY true!).

An application is any program that you need a computer for, such as word processing or managing a database. A small Java application that runs inside a web page is called an applet. An applet can do pretty much anything that a conventional application can do; so, for example, this list of applets includes calculators, graphers, simulators, and an MP3 player; chat rooms, email programs, and spam blockers are also written in Java.
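
For what it's worth, an applet is dropped into a web page with the HTML applet tag (the class file name below is hypothetical); the browser's JVM downloads the compiled Java code and runs it inside the rectangle the tag reserves:

    <!-- the browser's JVM runs Calculator.class in a 300x200 area of the page -->
    <applet code="Calculator.class" width="300" height="200">
      Your browser does not support Java applets.
    </applet>
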
What about JavaScript? JavaScript was created by Netscape as a simple set of commands that all browsers would recognize. Unlike Java, which is a completely separate programming language, designed for autonomous applications, JavaScript is a set of commands recognized by browsers. JavaScript programs, or scripts, are usually embedded directly in HTML files. The script executes when the user's browser opens the HTML file.

JavaScript allows the person visiting your website to interact with the site. A simple script might let the visitor select the background color of the page. Another script can detect the user's operating system and browser type, then give instructions appropriate to the user's particular computer. A third type validates user input. Drop-down menus and combo boxes are also things that you can do with JavaScript.
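
Here is a minimal sketch of that first example. The script lives right inside the HTML file and runs in the visitor's browser when a button is clicked:

    <html>
      <body>
        <p>Pick a background color for this page:</p>
        <!-- each button runs a one-line script that changes the page background -->
        <input type="button" value="White"  onclick="document.bgColor='white'">
        <input type="button" value="Silver" onclick="document.bgColor='silver'">
      </body>
    </html>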

(To be continued.)


On a Foray into HTML-2

This post has been edited for accuracy

So, to recap: the Web and the Internet are similar, and it's reasonable for people to use the terms as synonyms. It's just that the Web is what individual computer users have created with HTML, in the medium of the Internet. The Internet is older; it's the foundation and building material of the Web.

Web pages are created with HTML. This is simply a file type that can be read by a browser. Web pages are "made" of HTML; HTML is a markup language that explains to the browsers visiting the site how to display the text and images hosted at the website.

In addition to the HTML files that the browser reads, there are elements that the browser is told to display. Web browsers are designed to "read" (recognize the format and display accurately) JPEG images (*.jpg), GIF images (*.gif), TIF (*.tif), and bitmaps (*.bmp). They can also recognize other types of files, which I'll describe in a moment.

In addition to HTML files, the above-mentioned image files, and Java or JavaScript files, you can post pretty much any type of file you want on your website. However, in order to read things like an MS Word document or an Acrobat PDF, visitors need to have the corresponding software installed on their computer. Hence the popularity of Adobe Acrobat: the software for reading PDF files is free; people pay for the software that creates *.pdf files. These files will display in a new window of the browser, or in a window spawned by the visitor's computer (e.g., Windows or Mac OS will launch MS Excel if you open an Excel file at a website).
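
Concretely (the file names here are placeholders): an image file is displayed inline by the browser itself, while a link to a PDF hands the job off to a reader program on the visitor's computer.

    <!-- the browser renders the GIF inline, right in the page -->
    <img src="flag.gif" alt="Japanese flag">

    <!-- clicking this makes the computer spawn Acrobat Reader (or whatever reads PDFs) -->
    <a href="newsletter.pdf">Download the June newsletter (PDF)</a>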

WHAT ARE SOME OTHER FILE TYPES YOU CAN HAVE?
You can have MPEGs, which are files that are audio, video, or both. MPEG refers to a standards committee (like you needed to know that!), and this committee keeps issuing new formats. MPEG-2 is the standard used for most *.mpg files. A variant is MPEG-4, which was modified to create the Windows Media Video (*.wmv) format; Apple QuickTime (*.mov) is a third format. These file types can be created by different programs, and they can be played back by freely distributed playback software. Like Adobe Acrobat, the player is usually free, and the computer's operating system must spawn the player for the file to be seen. The file formats are mutually incompatible, although some players can play more than one format.

In addition to these, there is Macromedia Flash/FlashPlayer. This is like the others, except that Flash allows one to create a digital image by manipulating objects in the Flash software; it's like MS PowerPoint, with the ability to animate the presentation and upload it to the web. Flash files (*.swf) are typically viewed as an animated graphic within the web page; it's not usually necessary to spawn a new window for playback. As a result, one can combine animated and non-animated elements in a single page. Also, Flash is very easy to use, in my opinion.
COOL STUFF I NOTICED LATER: Here's a blog post about new features available in the latest release of Flash (hat tip to Wikipedia's Flash entry).
(To be continued)


13 June 2005

On a Foray into HTML

Some terms of art for the web:

Some of you are going to hear some technical language used here that is quite intimidating. A case in point is the jargon associated with web pages, the internet, and so on. The fact that many of these terms have multiple meanings doesn't make it easier, but let us hope this does.

First, many people surfing the internet may be a little confused by the terms "internet" and "web." These are almost, but not quite, synonyms. The internet is a network of networks, connected (at least initially) through the telephone lines, using signals much like voice transmission. The computers on it exchange data using a universal standard called TCP/IP, which grew out of work begun in 1969 on the ARPANET, a project of the Advanced Research Projects Agency (ARPA), a branch of the Department of Defense. Much later, a markup language called HTML was developed that allowed web browsers to take data sent over the network and render it as a formatted, graphical page, i.e., a web page. HTML and the first web browser were developed together, by Tim Berners-Lee at CERN around 1990. It's easy to see why the browser and HTML had to be invented concurrently: a browser had to be able to translate the incoming data into an image that could be displayed, and there had to be a standard format that every browser and server could agree on.

The internet was initially useful mainly to computer terminals connected to mainframes, running arcane software like FTP, Usenet, and Gopher. I recall having a lot of friends who were familiar with these services and talked about them a lot, and finding it inconceivable that these things would ever amount to anything but costly nerd toys. In 1993, however, NCSA's Mosaic emerged as the first widely adopted graphical browser, thereby popularizing, in a stroke, the world of interconnected hypertext we know as the "Web."

(To be continued.)


12 June 2005

Hands-Free Bluetooth for Motorists

Via PDA Buzz, news of a hands-free Bluetooth car kit for drivers. The company is Parrot, and we're interested in this because it's an example of intimate computing with [potentially] one type of input: the voice.* It plugs into the cigarette lighter of your car and sprouts upward like a chanterelle:


Click for image source



For those who complain that the lower part looks nothing like a chanterelle mushroom, visualize the lower part plugged in (and in about six months, when you'll be able to buy it in orange). The picture at PDA Buzz looks more like one of those fungi that pop out of the sides of trees up here in the Pacific Northwest.
The DriveBlue was very easy to Pair with the Treo, after I read the directions. All Bluetooth kits and headsets I previously tested had the simple pairing code of 0000. The DriveBlue uses 1234 which I would have known if I just looked in the manual. One nice feature was that when the DriveBlue was turned on for the first time a voice came through the speaker saying "Please Pair the Device." Once paired, it was able to function as a true Hands free kit with the Treo. That means that calls could automatically be routed to the DriveBlue. So, a call comes in and two rings later it automatically is picked up by the hands free kit. The voice was loud and clear coming out of the speaker. Two large buttons (green and red) can be pushed to end a call, put a call on hold, or to voice dial a contact. (sadly, the Treo 650 still doesn’t support this feature yet.)


* The reason why I say "potentially one type" is that the model they have does, in fact, have buttons for activation. Also, there's not yet a feature for voice input of data into other devices it might get paired with. You might say, "JRM, those are pretty big 'buts'." However, these features already exist separately (voice recognition of commands, for example); I expect it's a matter of very little time before they are combined to make the DriveBlue truly hands-free.


Vonage, Inc

Vonage is a medium-sized company that offers online telephony. What is that? Basically, when you go online like I am now, you use the modem to call your ISP, who charges you for a local call (or less). Then the ISP's computer links to other servers that host web pages (like Google) or, through email, other computer users. Communication between servers is cheap—so cheap it makes no difference if you're visiting a website in Bangalore or in Spokane. Likewise, sending an email to someone in the RSA doesn't cost more than sending one to someone in Denver. Even if you live in Denver.

It may seem odd, but it took until about two years ago for it to become common for people to use the internet to make telephone calls. For years the idea was blocked by the long-distance telephone companies. Then a firm called Vonage offered its own E911 service (a private 911 service); after this wedge was in the door, Comcast and TimeWarner launched telephony services as well (Forbes). Vonage soon offered general telephony everywhere, at reduced rates (the Register). I was awfully ashamed not to have heard of it, but it has sales of only $50 million and 600,000 subscribers—it's pretty new. Still, Vonage uses regular handsets, in contrast to Skype, which requires computers with internet access.

Labels: ,

11 June 2005

What is Skype?

Recently I've been hearing references to an intimate computing tool called "Skype." This is software that allows one to use an internet connection to make telephone calls. People who are "Skype-enabled" can therefore call each other long distance for very little. The software is a free download, and I paid a visit to the "Skype Journal" expecting to encounter a field of astroturf.

Apparently, there is some other social network involved, comparable to Ryze, Ecademy, LinkedIn, and Tribe. Ryze seems to be like a directed version of Friendster, in which people give you permission to join the community, and you then link up with others who presumably trust you a little more because you're a friend of a friendster. Ecademy is a networking arrangement for business people. Skype has downloadable software that allows telephony among members, and it likewise has a "community," which is visible mainly as a group of forums.

Technically, Skype—a company based in Luxembourg—implements a proprietary version of "voice over internet protocol" (VoIP) that it developed and controls. Competing with this are other versions of VoIP, such as the open standards SIP and IAX2, and the rivalry could eventually lead to format wars. Skype's strategy for winning this war includes allowing members of the Skype community to talk to each other for free (one can still use Skype to get very low rates when calling non-Skype numbers).

Who are these people and how do they make ends meet? It seems Skype is a company with a business plan similar to Adobe's (with Acrobat): get PDA users to make telephone calls over the internet, then offer a premium service to users who need to reach non-Skype phones (Science Daily).
A free download of Skype allows users with Internet connections to make free VoIP calls to other users of the program. In its first 18 months of existence, the company, based in Luxembourg, claims to have enrolled 41 million users, with an average 150,000 new users joining each day and a total of 118 million downloads of its software.

SkypeOut — as the company's current premium service offering is known — has 1.5 million registered users and allows access to traditional telephone lines at an average rate of 2 cents per minute.

Now, SkypeIn will give users a telephone number to receive calls as well. It costs $39 per year and shifts the company from its roots as a peer-to-peer service to a commercial service.

Despite Skype's rapid growth, it may not become a company to rival Vonage, the broadband telephone carrier, or traditional long-distance competitors, according to some industry analysts and even the company itself. Skype only recently created an option for billing its customers through PayPal. It also does not currently offer any live customer support or emergency services.

Anyway, more information is available in this Reuters story:
Skype's business plan has been to offer its basic service for free and then charge for additional services. But Zennstrom said the company has intentionally given developers free rein, even if their offerings compete with Skype's own offerings.

The privately-held company made a crucial decision early on to open its API — a set of protocols and routines that coders use to build new software applications — which allowed developers to write their own applications that fit neatly together with Skype.

The move involved surrendering a certain amount of control over how Skype is used. Indeed, some of the add-ons, such as "answering machine" software and a video conferencing application called Video4Skype (http://www.video4skype.com/), bump up against some of the products that Skype itself plans to offer.

I'll address terms of art like "podcast," and what Vonage is, in a subsequent entry.

Labels: , ,

10 June 2005

What is Dynamic Programming? (2)

Part 1

Often when you're designing something you need to find the maximum (or minimum) of some quantity, given constraints. For example, suppose you are trying to decide the optimal lot size for your inventory. You have an inventory of (say) quart containers of SAE-400 motor oil, which at any particular time is I(t). Annual sales of this item are A, and you order it n times a year in lots of size x:
A = nx
Well, holding inventory is not cheap: you have both a holding cost and an ordering cost. The average stock on hand is x/2, the cost of holding one unit of inventory for the year is Ch, and the cost of placing an order is C0. One way to solve this sort of problem is with a Lagrangian, and the same approach can be applied to choosing the number of extra special-purpose keys you want on a small electronic device (a customized keypad, for example). Adding more keys is not cheap; on the other hand, it can definitely make the device easier to use. You can also add variables to capture as many factors as you need to analyze.
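
To make that concrete, here is a minimal sketch of the lot-size example in Python (the numbers are invented for illustration, and instead of carrying the constraint A = nx with a multiplier, I just substitute n = A/x): total cost is the ordering cost C0 times n, plus the holding cost Ch on the average stock x/2, and setting the derivative to zero gives the familiar square-root rule.

# A minimal sketch of the lot-size example; the numbers are invented.
import math

A  = 12000   # annual sales, in units (hypothetical)
Ch = 0.50    # cost of holding one unit in stock for a year (hypothetical)
C0 = 25.00   # cost of placing one order (hypothetical)

def total_cost(x):
    """Ordering cost for n = A/x orders plus holding cost on average stock x/2."""
    n = A / x
    return C0 * n + Ch * (x / 2)

# Setting d(total_cost)/dx = 0 gives x* = sqrt(2*A*C0 / Ch).
x_star = math.sqrt(2 * A * C0 / Ch)

print(f"optimal lot size x* = {x_star:.0f} units")
print(f"orders per year  n* = {A / x_star:.1f}")
print(f"minimum annual cost = {total_cost(x_star):.2f}")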

That's "classical programming." In the case of "nonlinear programming," you're working with an inequality constraint. That means, your ideal solution does not have to be a point on the constraint function; it can be less than (greater than) the constraint.
One of the greatest challenges in NLP is that some problems exhibit "local optima": spurious solutions that merely satisfy the requirements on the derivatives of the functions. Think of a near-sighted mountain climber in a terrain with multiple peaks, and you'll see the difficulty posed for an algorithm that tries to move from point to point only by climbing uphill. Algorithms that try to overcome this difficulty go by the name "global optimization."
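
Here is a toy illustration of the near-sighted climber (the function, step size, and starting points are all invented): plain hill climbing from different starting points lands on different peaks, and a crude "global" strategy is simply to keep the best result from many restarts.

# Toy illustration of local optima: naive hill climbing on a function with
# two peaks (a lower one near x = -1.0 and a higher one near x = +1.0).
def f(x):
    return -(x**4) + 2 * x**2 + 0.3 * x

def hill_climb(x, step=0.01, iters=10000):
    for _ in range(iters):
        if f(x + step) > f(x):
            x += step
        elif f(x - step) > f(x):
            x -= step
        else:
            break        # no uphill move: a local (not necessarily global) peak
    return x

for start in (-2.0, -0.5, 0.5, 2.0):
    top = hill_climb(start)
    print(f"start {start:+.1f} -> peak at x = {top:+.2f}, f = {f(top):.3f}")

# A crude global-optimization strategy: keep the best of many restarts.
best = max((hill_climb(s) for s in (-2.0, -0.5, 0.5, 2.0)), key=f)
print(f"best of all restarts: x = {best:+.2f}")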

The word "Programming" is used here in the sense of "planning"; the necessary relationship to computer programming was incidental to the choice of name.

My textbook (a very old one) includes the problem, "a group of N persons own a square lot and plan to build their homes on it... They would like to ensure that the minimum distance between the centers of any two houses is as large as possible." (p.67, Intriligator). Again, nonlinear programming can be used to enhance designs and confirm that they are optimal, given the restrictions known to exist.
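
Just to show how such a problem can be posed as a nonlinear program, here is a sketch (assuming a unit square, N = 4 houses, and the off-the-shelf SLSQP solver from the scipy library, none of which come from Intriligator's text): maximize the spacing d subject to every pair of houses being at least d apart.

# Sketch of the "houses on a square lot" problem as a nonlinear program.
# Assumes a unit square and N = 4; the solver and starting point are arbitrary.
import numpy as np
from scipy.optimize import minimize

N = 4
rng = np.random.default_rng(0)

# Decision variables: z = [d, x1, y1, ..., xN, yN].
# Maximize d subject to (xi - xj)^2 + (yi - yj)^2 >= d^2 for every pair i < j.
def objective(z):
    return -z[0]                     # minimizing -d is maximizing d

def pair_constraint(i, j):
    def fun(z):
        xi, yi = z[1 + 2*i], z[2 + 2*i]
        xj, yj = z[1 + 2*j], z[2 + 2*j]
        return (xi - xj)**2 + (yi - yj)**2 - z[0]**2
    return {"type": "ineq", "fun": fun}

constraints = [pair_constraint(i, j) for i in range(N) for j in range(i + 1, N)]
bounds = [(0.0, 2.0)] + [(0.0, 1.0)] * (2 * N)    # d first, then the coordinates
z0 = np.concatenate(([0.1], rng.uniform(0, 1, 2 * N)))

result = minimize(objective, z0, method="SLSQP",
                  bounds=bounds, constraints=constraints)

print(f"minimum spacing achieved: {result.x[0]:.3f}")
print(result.x[1:].reshape(N, 2).round(3))
# For N = 4 the true optimum is the four corners (spacing 1.0); being a local
# method, SLSQP may need a few random restarts to find it.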

Labels:

07 June 2005

Intel inside Macs? Oh, the humanity!

First, United Airlines has teamed up with Verizon to offer Wi-Fi to passengers (hat tip to Digital Lifestyles). What this means is that UAL has gotten FAA approval to install Wi-Fi on Boeing 757-200's, which remain a rather small part of the UAL fleet. When can you start updating your blog over the Pacific Ocean? According to Forbes, UAL still needs FCC permission; a firm date is expected by August, and the launch is presently scheduled for sometime next year.

Also big news, to me: the heartbreak that Apple Macintoshes are switching over to Intel (i.e., they're dumping the IBM PowerPC chips). This is really huge news, and very deeply disappointing. For years PowerPCs not only powered Apple computers and blazed some very bold trails in semiconductors; they also powered IBM workstations, minicomputers, and parallel supercomputers. I was really excited when the PowerPC hit the streets in the mid-1990s because I expected it to serve as the bridge for topflight workstations running Windows, MacOS, and supercomputing applications. None of this panned out, and after Jobs returned to Apple as chief executive (and scuppered the NeXTstations), he pulled the plug on licensing the MacOS. Mac became the costly, "cool" computing platform, and your correspondent, a destitute dorkwad, disconsolately bought a PC.

The PowerPC was developed jointly by then-CPU titan Motorola, IBM (from its in-house POWER chips, which used a RISC architecture but ALUs that could calculate square roots in a single clock tick), and Apple. IBM does continue to support the PowerPC in its systems, and another big user is the auto industry—to control emissions systems—but in the future, the big consumer of the PowerPC will be the video game industry, with its voracious demand for graphics. The PowerPC was the first chip to feature an integral digital signal processor.

Labels:

06 June 2005

Wireless Subscribers Set to Grow

InStat (via email):
Worldwide, wireless subscriber growth is experiencing robust expansion after several years of slower growth due to the economic downturn of the last few years, reports In-Stat (http://www.in-stat.com). By 2009, the high-tech market research firm forecasts the worldwide wireless market will grow to more than 2.3 billion subscribers. There will be no relief from the ongoing battles for airlink supremacy over the next several years.

"GSM's steady growth through 2007 will turn negative as operators move subscribers to third-generation (3G) WCDMA," says David Chamberlain, In-Stat Senior Analyst. "While the second-generation GSM system (including GPRS & EDGE) will remain the dominant airlink throughout the forecast period, CDMA airlink standards (CDMA & WCDMA) will soon encroach on GSM's numbers. By 2009, WCDMA networks will be providing service for over 40% of the world's CDMA users."

Two points:
  1. Ever notice how confident analysts are of projections over the next four years (rather than the next one year)? I understand that sales are assumed to vary as a log of the previous year's sales, and the log is assumed to regress to the mean in any given sales period, but that's not how sales growth behaves.

    In Butterfly Economics, Paul Ormerod explains how major variations in consumer behavior occur by drawing an analogy to ants seeking out a food source: ants respond to signals from one another, and this tends to cause tiny variations to snowball into group behavior. Projections involving logs that revert to trend may describe vast, integrated economies like that of North America, but in an industry like cell phones, if demand doesn't grow, capital investment and employment will gradually be redirected and the growth horizons for that industry will shrink.

  2. While I find the industry growth projections a bit suspect, the bit about the GSM standard being headed for decline is rather interesting. GSM is a mandatory EU standard; in North America, GSM competes with CDMA and W-CDMA. D-AMPS is another standard used in North America, and it is being phased out entirely. According to the Wikipedia articles cited, W-CDMA is widely used in Northeast Asia, albeit sometimes under proprietary standards (e.g., DoCoMo's FOMA, the most popular system in Japan).

    The EU tends to amplify its influence on the wireless world by insisting on cohesive action: all wireless PCS operators in the EU are required to conform to a single standard. Other regional markets tend to have standards driven by industrial alliances. At the moment GSM is extremely widespread, but the EU plans to shift everyone to the far faster UMTS soon.

Developing and imposing a universal wireless standard is very hard. Only the EU's ETSI has ever succeeded, with GSM (1991). When developing a format, it had to
  1. Use technology that was readily available to European firms
  2. Use technology that would not impose any sort of dependence on a US-based firm, such as Qualcomm (EU bodies have an extreme ideological antipathy to the USA)
  3. Ensure that this technology can be implemented cheaply, thereby limiting the opportunity costs borne by EU firms and consumers alike
  4. Ensure that this technology can sustain massive future growth in capacity.
  5. Win the cooperation of major EU telecom firms or IT firms
No American body has ever achieved such success, leading in some respects to a stagnation of wireless technology in the United States.

GSM's success had major economic implications that transcend such matters as European domestic private investment, X-efficiencies in the European economy, labor market "flexibility," and other common economic indicators. Had GSM been a poor design, firms like Nokia and Ericsson would have remained interesting but obscure technical players. If it had incorporated Qualcomm technology, the infant-industry protection afforded by the EU and ETSI would have been compromised: revenue streams would have flowed out of the EU, and Qualcomm would have had bargaining power against its rivals in the EU common market.

The development of UMTS is an interesting twist. A crash program of development was launched by ETSI in 1992, the same year that all EU member states implemented GSM. UMTS, the EU version of CDMA, was finalized in 1999 and scheduled for implementation by January 2002 at the latest (the same time as the euro). While UMTS is definitely European (in fact, it's a derivative of W-CDMA, which was developed by the ETSI to be patent-wise independent of Qualcomm's CDMA), it was pioneered by Hutchison Whampoa (of Hong Kong) and DoCoMo (a former division of NTT, of Japan). The engineering for implementing the technology was very much dominated by NEC, NTT, and Siemens (of Germany).

In the USA, the competing standard was CDMA2000. This standard was developed by Qualcomm and implemented first by KDDI of Japan (2002). I should mention that GSM is big in the USA as well; it's just that non-GSM phones were effectively banned in the EU. In both cases, the principal technology developers are relying on licensing their way to global dominance.

Labels: , ,

FireWire

What is FireWire? According to the Apple Website:

FireWire is a cross-platform implementation of the high-speed serial data bus ...that can move large amounts of data between computers and peripheral devices. It features simplified cabling, hot swapping, and transfer speeds of up to 800 megabits per second (on machines that support 1394b).

Major manufacturers of multimedia devices have been adopting the FireWire technology, and for good reason. FireWire speeds up the movement of multimedia data and large files and enables easy connection of digital consumer products — including digital camcorders, digital video tapes, digital video disks, set-top boxes, and music systems — directly to a personal computer.

Standards have to be defined; there's an association of electrical and electronics engineers (the IEEE) that develops these things, and it has issued a series of specifications describing how FireWire works (IEEE 1394a and 1394b). If you're interested in a detailed technical introduction to FireWire, click the link above and scroll down to the end of the article for more links.

For those of you like me who just need to know the significance: there are two ways of transmitting data among microchips, serial and parallel. In serial transmission, the bits flow along a single corridor, one after another. In parallel transmission, there is one lane for each bit in the "word," so a computer with a 64-bit data bus (like the Intel Pentium 4 or the AMD Athlon) communicates over what amounts to a 64-lane freeway. Wireless communication, of course, doesn't really allow for parallel transmission of data; it requires serial communication.
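
To put rough numbers on it, here is a back-of-the-envelope sketch (the file size and the parallel bus clock are invented purely for comparison):

# Back-of-the-envelope transfer times; the file size and the 100 MHz
# parallel-bus clock are invented for the sake of comparison.
FILE_BITS = 8 * 10**9              # a 1 GB file (10^9 bytes), expressed in bits

# Serial: one bit after another down a single lane.
for name, bits_per_second in [("FireWire 400", 400e6), ("FireWire 800", 800e6)]:
    print(f"{name}: {FILE_BITS / bits_per_second:.1f} seconds for 1 GB")

# Parallel: 64 bits per clock tick across 64 lanes.
clock_hz = 100e6                   # hypothetical 100 MHz bus
print(f"64-lane bus at 100 MHz: {FILE_BITS / (64 * clock_hz):.2f} seconds for 1 GB")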

This was a big deal for Apple, whose Macintosh line uses serial ports for most peripherals. Apple was the main developer of FireWire but, unfortunately, stumbled somewhat over licensing the technology (Wikipedia). Since the late 1990s, and the demise of SCSI, FireWire has made a comeback. This may have something to do with Apple's continued involvement in (and control over) the development of its peripherals, which has drawn it more and more closely to Sony and the consumer electronics industry.

If FireWire becomes the most popular format for serial buses, I imagine devices not using it would have to convert rather quickly, since the market is less tolerant of oddball formats than it used to be. This would be a small but important victory for Apple, Sony, and the consumer electronics mode of computer use.

Labels:

05 June 2005

What is Dynamic Programming?

Dynamic programming has nothing to do with computer programming; the name was coined by the American mathematician Richard Bellman in the early 1950s, back when "programming" did not refer to the business of writing computer code. Basically, DP was developed as a tool for finding the best path for an activity, where the definition of "best" depends on context. Most of what I know about the subject comes from Mathematical Optimization and Economic Theory by Michael D. Intriligator (1971, Prentice-Hall).
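
As a taste of what "best path" means, here is a minimal sketch of the Bellman idea on a made-up, cycle-free network of travel costs (nothing here comes from Intriligator): the best cost from any node is the cheapest immediate step plus the best cost from wherever that step lands, so you can work backwards from the destination.

# Minimal sketch of the dynamic-programming recursion on a made-up network:
# best(node) = min over outgoing edges of (edge cost + best(next node)).
edges = {
    "A": {"B": 2, "C": 4},
    "B": {"D": 4, "E": 1},
    "C": {"D": 1, "E": 3},
    "D": {"goal": 3},
    "E": {"goal": 6},
}

best = {"goal": 0}      # cost-to-go from the destination is zero
choice = {}

# Work backwards; each node is handled only after all of its successors.
for node in ["D", "E", "B", "C", "A"]:
    nxt, cost = min(((n, c + best[n]) for n, c in edges[node].items()),
                    key=lambda pair: pair[1])
    best[node], choice[node] = cost, nxt

# Recover the optimal path from A by following the stored choices.
path, node = ["A"], "A"
while node != "goal":
    node = choice[node]
    path.append(node)

print("cheapest cost from A:", best["A"])   # 8, via A -> C -> D -> goal
print("path:", " -> ".join(path))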

Some of you will want to brush past me; here are three online resources on DP:
  1. "Dynamic Programming," David B. Wagner (courtesy Wikipedia)
  2. "A Tutorial on Dynamic Programming," Michael A. Trick (courtesy Wikipedia)
  3. Lecture Notes on Optimization, Pravin Varaiya, Chapter 9 (1971; 1998; PDF; courtesy Alex Stef).
Prof. Varaiya's link is to a math book, so be advised; it's a PDF and very well done. The part about DP begins on page 129 of the PDF.

In the meantime, I'll be discussing a few concepts as I skim them in this textbook on my lap.

First, a note on the word "programming": this is a mathematical term initially applied to the problem of choosing values of certain variables so as to maximize or minimize a given function, subject to constraints. For example, suppose you can enclose a rectangular yard, with sides a and b, using a fence of finite length C. The area is a*b, but the fence cannot be longer than C, so at best 2a + 2b = C. This is a pretty basic problem in calculus. Much more complex versions of the same problem can be used to find the optimal shape of a machine part, or the optimal path of a rocket being launched into space.
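
For the fence example, a quick symbolic check with a Lagrangian (this uses the sympy library, and the perimeter C is left as a symbol) confirms the familiar answer: the best enclosure is a square with sides C/4 and area C^2/16.

# Symbolic check of the fence example: maximize a*b subject to 2a + 2b = C.
import sympy as sp

a, b, lam, C = sp.symbols("a b lambda C", positive=True)
L = a * b - lam * (2 * a + 2 * b - C)      # the Lagrangian

# Set the partial derivatives with respect to a, b, and lambda to zero.
sol = sp.solve([sp.diff(L, a), sp.diff(L, b), sp.diff(L, lam)],
               [a, b, lam], dict=True)[0]

print(sol)                                 # {a: C/4, b: C/4, lambda: C/8}
print(sp.simplify(sol[a] * sol[b]))        # maximum area: C**2/16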

(To be continued)

Labels:

04 June 2005

What is DMB and why is it trying to crawl into my ear? (Part 2)

Part One

The Republic of Korea and, to a somewhat lesser degree, Japan have adopted DMB widely in order to allow a massive concentration of wireless activity. Cellular telephones in the RoK already incorporate 3G features that require broadband connections; the lines between personal computers and PDAs have faded, producing a youth culture in the country that is highly tech-savvy. Radio broadcasting has become audio-internet in NE Asia; television is on its way to losing the passive "idiot box" orientation of its first 50 years and becoming more like the internet.

Some of this is, of course, overhyped, and it's easy to get swept up in the anticipation of an arcadian future for these countries, a sort of NE Asian "School of Athens." Koreans are not going to abandon game shows for historical epics merely because producers have pay-per-view. However, the technology is likely to radically change the way people consume entertainment. For one thing, the choice will be far greater, and the real source of demand for "content" will no longer be, as before, advertisers; it will be viewers. My guess is that consumers will pay more attention to programs, since they will be paying for them.

The old arrangement, under which a TV entertainment consumer videotaped programs while away and then watched them at leisure, is going to be replaced by a server in a warehouse responding to signals from consumers. The consumer will simply have the PCS call up the movie or the episode of a soap opera on demand.

What is DMB and why is it trying to crawl into my ear?

Finding references to digital multimedia broadcasting (DMB) is easy; there has been a plethora of new products released in the format (Google News search, Technorati), and it's already assumed that I know what it is (insert exasperated expletives here). Well, thanks be to the Almighty and In-Stat, I do ("DMB in Korea," Jan '05; PDF). Basically, DMB is a media transmission format developed in Korea, based in part on the European Digital Audio Broadcasting (DAB) initiative. I won't go into Eureka 147 and its development, except to mention that the MP2 and MP3 audio formats emerged from that research project. In the USA, DAB has been hampered by competing formats (TSSF).* In Japan and Korea, DAB has been overtaken by DMB.

One virtue of DAB is that radio receivers in a car or airplane don't need to be retuned as one moves from place to place; reception quality is also consistently good. The signal is transmitted either by satellite or by terrestrial stations, and the same is true of DMB (T-DMB for terrestrial; S-DMB for satellite).

DMB was an initiative launched in Korea around 2002 and quickly adopted by Japan. Both national initiatives favored S-DMB, which involves less up-front investment. The In-Stat report intimates that the initiative was taken by industrial associations in the RoK and then taken up by counterparts in Japan. DMB differs from DAB mainly in its compression formats, which achieve higher density. As a result, DMB-configured devices can operate in a more crowded environment than DAB devices can, and the types of data they carry are more varied.

A drawback of DMB seems to be the cost of implementation. In order to implement T-DMB outside of greater Seoul, for example, all television transmission has to be digitized, because analog transmissions interfere with T-DMB signals. In fact, this changeover was part of the original plan, since Korean wireless and wideband usage is the highest in the world. In my next post, I'll discuss the cultural effects of DMB.

*Curiously, US laws make it far costlier to run a radio station in digital format than in analog format. According to the Wikipedia article linked above, if a song is transmitted using analog technology, then the station pays a royalty to ASCAP, BMI, or SESAC (which represent the songwriters); if the song is transmitted digitally, then RIAA (which represents the performers) must also get a cut. It's rather astonishing how prohibitive US laws are against new consumer technologies generally.