By admin in Distributed Computing
Don’t let the old cattle barn at Alden Electronics fool you into thinking this is just another sleepy New England farm. The garden of satellites next to the silo is a dead giveaway that Alden is doing more than pushing tractors in a field. In fact, this marine electronics and weather data distributor is tilling a new client/serverlike application that will deliver up-to-the-minute weather information over the Internet to customers. If all goes well, it could eventually put the satellites out to pasture.
Alden isn’t the only company pushing the Internet beyond its limits. Companies such as brokerage firms and nuclear fusion research houses are discovering that the Internet’s most compelling draw is not catalog shopping in cyberspace, but rather the lessons IT managers can learn when it comes to designing client/server systems. Internetlike distributed architectures–which use brokers to pass requests along between clients and servers–can be developed and maintained more easily than traditional architectures, with the added benefits of code reuse and robust performance, experts say.
“In the next 20 to 24 months after people … don’t think the cybermall is the Holy Grail anymore, [they will see] that the real strength of the Internet is corporate communications,” says Jim Medalia, president of DXX’s Internet Business Center, an Internet service provider and consultancy, in New York. With concerns over security and reliability abating, Medalia and other Internet access providers say a growing number of corporate clients are exploring the possibility of distributing client/server applications over the Net.
Alden is taking steps toward that goal. The Westboro, Mass., firm has been delivering large graphics and real-time satellite images to a test group of university customers over the Internet for months “without any trouble,” says William Highlands, manager of data communications systems. Alden’s application, which uses Denver-based Unidata Inc.’s Local Data Manager on both the “client” and “server,” has been up and running over the Internet since last September. The LDM software transmits large amounts of alphanumeric and graphical data over the Internet and packages it so that the client software can identify and reconstruct what was sent from the transmitting station.
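LDM’s actual wire format isn’t spelled out here, but the general idea–tag each outgoing product with a small header so the receiving station can identify and reassemble it–can be sketched as follows. The function names and header fields are illustrative, not Unidata’s API:

```python
import json
import struct

def pack_product(product_id: str, data: bytes) -> bytes:
    """Frame a data product: a length-prefixed JSON header, then the payload.

    The header tells the receiver what the product is and how big it is,
    so the client side can identify and reconstruct what was transmitted.
    """
    header = json.dumps({"id": product_id, "size": len(data)}).encode()
    return struct.pack("!I", len(header)) + header + data

def unpack_product(frame: bytes) -> tuple[str, bytes]:
    """Reverse pack_product(): recover the product ID and payload."""
    (hlen,) = struct.unpack("!I", frame[:4])  # 4-byte big-endian header length
    header = json.loads(frame[4:4 + hlen])
    payload = frame[4 + hlen:4 + hlen + header["size"]]
    return header["id"], payload
```

The length prefix lets the receiver pull complete products off a byte stream even when a transmission carries many products back to back.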
Before Internet transmission becomes Alden’s modus operandi for more critical customers, such as the military and commercial airlines, Highlands admits that security and control issues will have to be more fully ironed out. But there’s a distinct possibility that the Internet could give Alden’s satellite system a run for the money. “[We'll] be able to reach more customers with less fuss,” says Highlands about the new Internet delivery system.
Cost and return on investment will also be a crucial factor before Alden’s satellites are replaced by the infobahn. According to President Arnold Kraft, small satellites and receivers cost the same as–or less than–an Internet server and software. “But that looks like it could change,” he says.
One element of the Internet that should be put to good use by application architects is its use of brokers–software that can handle requests for data or other services without knowing a lot of detail about either the requester or provider of the service. Cliff Commington, vice president of marketing at BBN Planet Inc., in Cambridge, Mass., says that architecture is critical. “The Internet is just an applications platform. What do you do with it? You build applications,” he says.
For example, each World-Wide Web server on the Net is in essence a broker, which uses hypertext links to provide information or pass on requests for data to another Web site even if it knows little about the systems at either end. Even if an application isn’t running over the Internet, a distributed, broker-based architecture can pay off in spades with reduced development time and increased flexibility.
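The broker pattern the article describes can be reduced to a very small sketch: providers register services by name, and clients request them through the broker without knowing anything about who fulfills the request. This is a generic illustration, not any vendor’s product:

```python
class Broker:
    """A minimal request broker: clients ask for a service by name;
    the broker forwards the call without knowing the details of
    either the requester or the provider."""

    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        """A provider advertises a service under a well-known name."""
        self._services[name] = handler

    def request(self, name, *args):
        """A client asks for a service; the broker routes the request."""
        if name not in self._services:
            raise LookupError(f"no provider for {name!r}")
        return self._services[name](*args)

# A "server" registers a service; a "client" calls it through the broker.
broker = Broker()
broker.register("quote", lambda symbol: {"symbol": symbol, "price": 42.0})
quote = broker.request("quote", "TROW")
```

Because the client only names the service, the provider can be replaced, moved, or rewritten without touching client code–the reuse and flexibility benefit the experts cite.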
That’s been the case at T. Rowe Price, in Owings Mills, Md., which is using the broker approach for the next version of its Client Access Inquiry System. This application, which lets customers dial in to mainframe databases to check financial information, such as the performance of funds, employs broker services provided by Open Environment Corp.’s Entera development tool. The system works in a three-tiered environment of clients, functionality servers, and database servers.
The brokers provided by Entera are the key to this architecture, says system architect Kirk Kness. They work alongside the three-tiered architecture, “like a big yellow pages,” keeping track of which application services are available on which servers and telling client applications where to find what they need. Because none of that information needs to be coded into the three tiers, it is easier to port the services to other platforms. It also reduces the need for cumbersome updates to the clients, Kness says.
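The “big yellow pages” Kness describes is a directory: servers advertise which services they offer, and clients look up a location at run time instead of hard-coding addresses. A minimal sketch of that lookup follows; the class, method names, and service names are illustrative, not Entera’s API:

```python
class YellowPages:
    """A broker-style directory mapping service names to server locations,
    so no addresses need to be coded into the client, functionality, or
    database tiers."""

    def __init__(self):
        self._directory = {}

    def advertise(self, service: str, host: str, port: int) -> None:
        """A server announces that it offers a service at this address."""
        self._directory.setdefault(service, []).append((host, port))

    def locate(self, service: str) -> tuple[str, int]:
        """A client asks where a service lives before connecting."""
        servers = self._directory.get(service)
        if not servers:
            raise LookupError(f"no server offers {service!r}")
        return servers[0]  # a real broker might load-balance here

yp = YellowPages()
yp.advertise("fund-performance", "dbserver1.example.com", 7001)
host, port = yp.locate("fund-performance")
```

Moving a service to another machine then only means re-advertising it in the directory–no client update required, which is exactly the porting and maintenance win Kness points to.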
T. Rowe Price’s next step will be to deploy the application over the Internet, which the company hopes to do as soon as the end of this year. “This application could run across the Internet with no changes,” says Kness, who added that it will “when we get good security and good encryption.”
One of the ultimate challenges will be to actually distribute application code across the Internet. Researchers at Massachusetts Institute of Technology have taken a major step in this direction, with applications that help control the Alcator C-Mod tokamak, or experimental fusion reactor, at the school’s Cambridge, Mass., campus.
Tokamaks produce extremely high-temperature plasmas in the search for low-cost fusion power. With shrinking research budgets, there’s less money to build tokamaks, meaning researchers around the globe need to share existing ones. One solution would be to control tokamaks from a remote site, cutting down the cost and inconvenience of sharing.
That’s where the Internet comes in: This spring, MIT researchers working from the Lawrence Livermore National Laboratory, in Livermore, Calif., for the first time used ESnet (Energy Science Network), a subset of the Internet, to control and monitor experiments at MIT’s reactor in Cambridge.
The California researchers used a combination of Hewlett-Packard Co. workstations and a Silicon Graphics Inc. Indigo workstation, equipped with a camera to provide full-motion videoconferencing. ESnet also provided a 200K-bps link to Digital Equipment Corp. workstations in Cambridge, which control the tokamak, says staff scientist Steve Horne of MIT’s Plasma Fusion Center.
“We did not simply have people talking to each other,” says Horne. “We had the control panel of the machines and the displays on the [workstations] out there [on the Net].”
After solving an early problem with an intermediate node on ESnet that was dropping packets, response time was fast enough that the remote researchers could work as quickly as researchers in Cambridge–setting up one shot every 15 minutes. The setup even allowed for application partitioning over the Net. Users could move data or logic between different platforms to avoid network bottlenecks simply by switching between X Windows screens on their workstations.
The lesson from these companies’ pioneering efforts: Whether or not you bring applications to the Internet, it’s smart to start bringing a little bit of Internet design to applications.