About 60 miles north of Oahu, Hawaii, three miles below the surface, sits the world's deepest underwater observatory -- an ambitious project that sends power and Ethernet connectivity from the island all the way to the ocean floor.
The University of Hawaii's ALOHA Cabled Observatory (ACO) uses a retired undersea cable from AT&T to collect a stream of constant, real-time data that measures water pressure, oxygen levels, currents, temperature, salinity and more. Oceanographers say this wealth of information can shed light on issues ranging from climate change to earthquakes. The ACO even boasts live video and hydrophone capabilities, allowing researchers to record the songs of the migrating humpback whales that breed in Hawaiian waters every winter.
In this edition of The Subnet, University of Hawaii IT specialist Brian Chee -- director and founder of the school's Advanced Network Computing Laboratory -- takes us into the deep sea and explains how he got the observatory network up to speed.
What is your role at the University of Hawaii?
Brian Chee: I work for the dean's office at the School of Ocean and Earth Science and Technology; we're the No. 3 oceanographic institute in the United States. We own the planes, the ships, the submarines and the undersea cable. I act like a communications or systems analyst for the dean's office, and then I get assigned to different projects on an as-needed basis.
Tell us about what you do with the ALOHA Cabled Observatory.
Chee: The [observatory] itself is about the size of a VW Beetle, and from that, we have plugs on the end where we use special connectors to terminate both fiber-optics and power simultaneously at pressure. And when I say ‘pressure,' I mean 500 atmospheres, which translates to 7,000-something PSI.
The dean's office assigned me to this project about five or six years ago when they were having issues with their network. I had to make the network more reliable and redesign it so that it would be multi-tenant -- the idea being that it is literally an internet connection and a power connection three miles underwater. And by using remotely operated vehicles, we can place and remove projects for various schools and research groups.
The problem with underwater cables is they are atrociously expensive. AT&T donated their [undersea cable], called HAW-4, to the university for $1 when it was retired. It was deployed in the late 1980s, and it was the first fiber-optic cable between the continental U.S. and Hawaii. The cable actually comes up in Makaha, on the leeward coast of Oahu. And it comes in through, literally, a bunker underground. Our equipment is in that cable landing station.
So we have the original AT&T systems, which provide control and so forth, and the repeaters are optical switches so that if, say, some idiot drags an anchor, hooks our cable and manages to break a strand, we can actually route around that broken strand. The shore equipment also injects power down the line.
So you're still using that same undersea cable and those old network technologies today?
Chee: Yes. Undersea cables are expensive enough that, usually, no single country can afford them. So it's usually owned by a consortium of countries or corporations. So as they become retired, we are attempting to grab them.
For instance, we're actually trying to get TAT-12, which lands in Rhode Island. And we want to be able to pick that up, and then wrap it around the Titanic, so that it can have cameras and floodlights, and environmental sensors, and things like that. So we can actually measure the environment around the Titanic.
The reason why we're being even considered for this is because we're the only group on earth at the moment that knows how to convert the proprietary signaling that's on these undersea cables into a standard network communication. So we go from the proprietary AT&T signaling, to 100Base-FX -- which is an industry standard for [100 Mbps Ethernet over fiber optics] -- through a custom board that was designed and built for us by a retired AT&T engineer.
The ACO has two 10-foot long by 24-inch diameter titanium tubes. The equipment has to stay dry and at normal atmospheric pressure in those tubes, and that's where we convert to regular 100Base-FX. Then we use industrial, ruggedized networking gear: a combination of Cisco, Belden and Sixnet switches.
We also have a custom-made power supply down there that is computer-controllable, so we can turn power to each of those accessory plugs on and off, and we can also change the connection. We have copper connections that can either be serial for industrial control or Ethernet. And we have a switching device that can do that.
We also have CTD [devices], which measure conductivity, temperature and depth. From that, we can calculate the salinity of the water. And once our floodlights are fixed, then people will be able to watch the underwater world three miles down, live on the internet.
Can you talk more about the specifics of the network and the undersea cable? What issues were they having with the network when they brought you in?
Chee: It is a first-generation trans-oceanic fiber-optic cable, and we only have 100 Mbps full duplex down that pipe, which, in oceanographic terms, is a ton of bandwidth. A lot of our devices don't use much bandwidth at all. They also don't use a ton of power. So 1,800 watts is a lot of power for oceanographers.
Now, as far as the network problems go, the oceanographers are not what you would call network specialists. And they thought, ‘Oh, we'll just put plenty of space on there,' and they got a Class A network, which is massive. A Class A could handle the entire state of Hawaii or the entire state of Alaska. I explained to them that, ‘No, there's a ramification to that. It means all your routers are going to instantly roll over and die.' So I convinced them to go to a Class B, and then I sectioned up the network -- very standard networking stuff, but very new to the world of oceanography.
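The scale difference Chee describes can be seen with Python's standard `ipaddress` module. This is just an illustrative sketch using the RFC 1918 private ranges, not the observatory's actual addressing plan: a classful "Class A" is a /8 with roughly 16.7 million addresses, while a "Class B" (/16) can be sectioned into 256 conventional /24 segments.

```python
import ipaddress

# A legacy "Class A" allocation: a /8 carries ~16.7 million addresses --
# vastly more than one observatory (or one state) needs.
class_a = ipaddress.ip_network("10.0.0.0/8")
print(class_a.num_addresses)   # 16777216

# A "Class B" (/16) is still roomy, and can be sectioned into /24
# segments so broadcast domains and routing tables stay manageable.
class_b = ipaddress.ip_network("172.16.0.0/16")
segments = list(class_b.subnets(new_prefix=24))
print(len(segments))           # 256
print(segments[0])             # 172.16.0.0/24
```

Sectioning the /16 this way is the "very standard networking stuff" Chee mentions: each project or device class gets its own segment instead of sharing one flat broadcast domain.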
I also swapped out dumb devices for smarter, manageable ones. I wanted to be able to gather troubleshooting information -- SNMP polling and the ability to ping devices -- without having to come all the way up the cable.
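The payoff of manageable devices is remote reachability checks. The following is a minimal sketch of that idea using a plain TCP connection test in Python's standard library; the device names and addresses are hypothetical (TEST-NET placeholders), not the ACO's real inventory, and a production setup would use ICMP ping or SNMP polling instead.

```python
import socket

def reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical inventory of manageable gear on the observatory subnet.
devices = {
    "junction-box-switch": ("192.0.2.10", 22),
    "camera-dome-console": ("192.0.2.11", 22),
}

for name, (ip, port) in devices.items():
    status = "up" if reachable(ip, port, timeout=0.5) else "down"
    print(f"{name:22s} {ip:12s} {status}")
```

Run from the shore station, a sweep like this tells you whether a fault is in a device or in the cable path, without sending a remotely operated vehicle three miles down.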
In one of the camera domes, I have an Opengear console server. And it's kind of a unique product in that I can feed power into the device, and it will feed power over Ethernet downstream. So that's how I'm powering one of my webcams; it's actually a commercial off-the-shelf webcam. It also gives me RS485, which is an industrial control serial connection, and that's how we control the lights. And it also gives me regular Ethernet so I can send it out to sensors, and I can check things like humidity and barometric pressure inside the dome.
How did you get into IT and, specifically, networking?
Chee: I got into IT in 1972. I was in middle school and had a job soldering together boards for IMSAI 8080s and Altair 8800s. Later on, I got a job working for Xerox Corp. as a printer product interfacing specialist, and our job was to interface Xerox equipment to non-Xerox equipment. One of the things that happened right around that time was this brand-new thing called networking.
Robert Metcalfe, at the Xerox Palo Alto Research Center, got a research paper written by Dr. Norman Abramson at the University of Hawaii, which was a project I was a ‘student slave' on. Anyway, he got that paper and Ethernet was born. At Xerox, I was tasked with teaching the salespeople how to sell networking and to install the networking gear, and we installed some of the first commercial networking products ever.
I got a full-time job with a regional distribution company in Honolulu, and through a lot of weird happenstances, they became the distributor for Novell NetWare, which was one of the first successful commercial networking companies on earth. And I became one of the first 10 certified Novell instructors. That was '87, I think. And then networking exploded.
I founded this lab mostly because, in a previous life, I was a senior computer scientist for the GSA Office of Information Security, and I'd hire kids just out of college with four-year university degrees, and then they were useless. I'd end up spending $20,000 or $30,000 training them, only to have them snarfed up by private industry.
About 17 years ago, I was in my hospital room recovering from cancer surgery. I thought, ‘Have I made the world a better place?' And I felt I hadn't. So I vowed to myself that I was going to do something about it, and I founded [the Advanced Network Computing Laboratory] so that I can give computer science, engineering and other students some real-world experience.
Wow. That's quite a career.
Chee: Yeah, it's weird. When I first got in, I was the snot-nosed little kid. And the next thing, I look up and I'm considered one of the old farts of the industry.
OK, here's one of our rotating questions on personal interests: What's the best thing you can cook?
Chee: The dish that I think is my best -- but I don't do it very often because it's an amazing pain in the ass to make -- is beef Wellington. My grandfather was a Chinese cook at a relatively famous Chinese restaurant in Waikiki. Grandma used to get chased out of the kitchen by Grandpa with his cleaver. So I learned how to cook from Grandpa, and I do most of the cooking in my family.