Network engineering and telecommunications

Networking infrastructure is vital to UCAR’s ability to function and prosper in a rapidly evolving scientific and technical environment. It enables many aspects of the scientific enterprise to flourish: business processes, scientific investigation and analysis, communication, global collaboration, and educational and outreach missions. Network infrastructure is the backbone of all other IT infrastructure and services, and a sound, reliable network is critical to building stable IT infrastructure at the higher levels. The goal of the networking infrastructure is to provide the fast, robust, and flexible foundation that all other IT services depend on.

Networking is a critical component of cyberinfrastructure and a global endeavor: networks interconnect and interoperate at the campus, metropolitan, regional, national, and international levels. Being well connected is a requirement for successful business operations and is especially important for a national research center such as NCAR/UCAR. Networking enables and supports UCAR’s scientific mission as well as its business operations, including interactions with funding agencies. Without networks, collaborative science would not happen today, and UCAR’s business operations would not function.

The Network Engineering and Telecommunications Section (NETS) plans, engineers, installs, operates, and maintains NCAR and UCAR’s state-of-the-art data networking and telecommunications facilities, and develops strategy and performs research for them. NETS provides a vital service to NCAR’s research communities by linking scientists to supercomputing resources and to each other. These activities are essential for the effective use of NCAR/UCAR’s scientific resources, and they foster the overall advancement of scientific inquiry. This work supports CISL’s computing imperative to provision hardware cyberinfrastructure for the atmospheric and related sciences. It also supports CISL’s computing frontier of center virtualization by providing infrastructure for science gateways and other Grid-based technologies.

NETS pursued these LAN projects in FY2014:

  • UCAR network infrastructure re-cabling
  • WASP inventory system
  • NWSC InfiniBand cabling (major replacement effort)
  • ArcGIS
  • Plookup
  • ITC Strategic Plan participation (networking, collaboration, security, collocation)
  • Softphones
  • ML Room 034 remodel and NETS shop relocation
  • Cellular phone support
  • Network monitoring
  • CISL Nagios centralization transition
  • Netflow – Icmynetflow
  • Archibus
  • Extraview
  • Multicast support activities
  • Business continuity
  • Everbridge Emergency Notification System (ENS) participation
  • UPS, grounding, wireless networking, IPT, Colocation Facilities Management (CFM)
  • ML 29 infrastructure design
  • IPT server replacement
  • Spring and Fall power downs
  • Cisco 6500 to 4500 replacement plan and order
  • NETSDB replacement and redundancy
  • UIS database project (PeopleDB)
  • Contact list conversion
  • Telecommunication closet cooling
  • Vidyo® expansion
  • CG2 colocation shutdown and move to ML 29
  • BiSON Boulder Node B implementation
  • Network Time Protocol (NTP) server and service

NETS pursued these MAN projects in FY2014:

  • Boulder Point-Of-Presence (BPOP)
  • Boulder Research and Administration Network (BRAN)
  • I2 DYNES project completed
  • City of Boulder CG4 inter-building cabling

NETS pursued these WAN projects in FY2014:

  • Front Range GigaPoP (FRGP) ongoing management and engineering
    • New FRGP cost model
    • Thirty-two new or renewed five-year agreements effective 7/1/14 – 6/30/19
    • Five participants left the FRGP (STAR, ARTstor, DenverHealth, CARL, and UCH)
    • 1850 Pearl Street colocation relocation
    • Expanded ESnet peering to 20 Gbps
    • UCAR Point of Presence (UPoP) merged with FRGP
  • New FRGP Participants: NEON, NREL, State of Wyoming, JeffcoSD, Colorado College, CSU-Pueblo, I2/USDA (pending)
  • National LambdaRail (NLR) shutdown
  • Internet2
    • Diversity Initiative Co-chair
    • Network/Connector Liaison
    • Frictionless Science Networking
  • Bi-State Optical Network (BiSON)
    • Design and order for Golden ring
    • Design and order for SCONE path
    • CSM BiSON hardware configuration implementation
    • UW NSF CC*NIE proposal support and network design
    • CU–B NSF CC*NIE proposal support
    • Boulder BiSON Node B network resiliency
  • XSEDE
  • Western Regional Network (WRN)
    • 100G Upgrade design and infrastructure and equipment order
  • NOAA Research Network (NWAVE)
  • The Quilt Project – National Regional Networks Consortium
    • Jeff Custard - Executive Committee
    • Jeff Custard - CIS Committee
    • Marla Meehl - Nominations Committee
    • Marla Meehl - Finance Committee
    • Marla Meehl – Research Working Group
    • Marla Meehl – CC*IIE Regional Collaboration Working Group
    • Fabian Guerrero – Spring and Fall travel grants
  • SC’13 SCinet Participation
  • RMCMOA proposal and award
  • Westnet Meeting Support
    • January 2014
    • June 2014

In FY2015, NETS will continue to provide support and enhancements for all of these essential networking services. NETS activities are primarily supported through UCAR Communications Pool indirect funds, the FRGP, and NSF Core funds.

Detailed project descriptions appear below for four of these projects: the 1850 Pearl colocation move; the FRGP agreement and cost model; the BiSON expansion to Golden, Southern Colorado (SCONE), and the State of Wyoming; and the CG2 colocation move to ML and NWSC.

1850 Pearl colocation move

Network colocation
John Hernandez (left) and Scot Colburn, FRGP Network Engineers, are activating the FRGP equipment in the new colocation space.
Network colocation
Carlos Rojas Torres (left) and Armando Cisneros test fiber optic cabling in the new colocation space.

The Front Range GigaPoP (FRGP), the Bi-State Optical Network (BiSON), and the Western Regional Network (WRN) are critical wide area networking (WAN) infrastructure for UCAR and for all the FRGP, BiSON, and WRN participants in Colorado, Wyoming, California, New Mexico, and Washington. The FRGP and BiSON are managed by UCAR. The Level3 colocation space at 1850 Pearl Street in Denver is the primary hub for the FRGP, BiSON, and WRN WAN services.

The criticality and breadth of this project’s impact made it particularly visible. As the contracting agent and manager of the FRGP and BiSON, UCAR was notified in mid-January 2014 that National LambdaRail (NLR) would cease services on 17 February 2014, with less than 30 days’ notice. The NETS FRGP team immediately started working on contingency plans to avoid a shutdown of services. One of those critical services was the FRGP colocation space shared with NLR at the Level3 facility. Ultimately, the NLR shutdown occurred on 17 March 2014, but this was still less than 60 days’ notice to move critical and complicated services.

Led by Pete Siemsen, the team of Bryan Anderson, Armando Cisneros, Scot Colburn, Susan Guastella, John Hernandez, and Carlos Rojas-Torres put in long hours and hard work on this project and documented their accomplishments on a wiki page.

This team successfully led a multi-component move, under tight timeframes and during off-hours, to relocate one of the main colocation facilities for the FRGP in Denver. The FRGP had sub-contracted space in an NLR suite at 1850 Pearl Street in Denver for approximately 10 years, affording the FRGP significant cost savings over that time. Given the NLR shutdown, NETS had to move out on short notice or face a possible power down of all the key gear and connectivity this location provided. All FRGP equipment and cross connects were moved from the existing Level3 NLR suite to new cabinets on a different floor of the Level3 Denver Gateway at 1850 Pearl Street. Because the NLR shutdown was confirmed for 17 March 2014 and would end with a hard power down of the facility, there was no flexibility in the date or time. The project was successfully completed on 15 March 2014, two days before the deadline.

Through adept planning and efficient teamwork, this very complex project was completed in a very short amount of time. It was accomplished with great skill, professionalism, communication, and attention to detail by all team members and by the many outside organizations that had to be involved and coordinated, including Level3, Internet2, ESnet, NOAA, and the FRGP participants.

Due to efficiencies gained, the new colocation costs no more than the shared NLR colocation, a result that benefits all of the FRGP. The FRGP has also gained independence from shared colocation; this path was chosen to avoid a similar forced move in the future.

FRGP agreement and cost model

FRGP costs
The new FRGP cost model is simpler and more equitable, and the Cost Calculator allows participants to estimate costs for changing services.

Every five years the FRGP agreements expire and must be renewed, and 2014 was one of those years. This was a major effort because significant revisions to the agreement were dictated by recommendations from a UCAR internal audit of the FRGP and by changing conditions in the network usage and services environment. A number of costs shifted from UCAR to the FRGP and had to be codified. In addition, it was decided to create a new, very different, and more equitable cost model, which required major changes to the FRGP agreement. Lastly, it was decided to eliminate the concept and organization known as the UCAR Point of Presence (UPoP) and merge those functions and participants into the FRGP, which required further changes to the agreement.

Developing the new cost model required multiple iterations, extensive discussion and meetings, and a great deal of patience. The new model is simpler, fairer, and more flexible. However, it is somewhat complex to explain, so a document was created to detail the model and a calculator was developed to run scenarios easily and quickly. As a result of increased FRGP costs, five participants left the FRGP, and that departure process had to be managed. Thirty-two participants remain in the FRGP, which is starting its 16th year of successful operation.
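
The production cost model and Cost Calculator are FRGP-internal documents, but the idea of a scenario calculator can be illustrated with a minimal sketch. In the Python example below, the base fee, per-Gbps rate, and colocation rate are invented for illustration only and are not the actual FRGP rates or model structure.

    # Hypothetical scenario calculator in the spirit of the FRGP Cost
    # Calculator described above. All rates and service categories are
    # invented for illustration and are NOT the actual FRGP cost model.
    BASE_FEE = 10_000        # flat participation fee per year (hypothetical)
    RATE_PER_GBPS = 2_500    # transit/peering service, per Gbps per year (hypothetical)
    COLO_RACK_FEE = 4_000    # shared colocation, per rack per year (hypothetical)

    def annual_cost(gbps: float, racks: int = 0) -> float:
        """Estimate a participant's annual cost for one service scenario."""
        return BASE_FEE + RATE_PER_GBPS * gbps + COLO_RACK_FEE * racks

    # Quickly compare scenarios, e.g. the effect of upgrading 10G -> 20G service:
    for gbps in (10, 20):
        print(f"{gbps} Gbps with 1 rack: ${annual_cost(gbps, racks=1):,.0f}/year")

The real calculator serves the same purpose: a participant changes one service parameter and immediately sees the estimated cost impact.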

The FRGP is functioning very effectively and efficiently under the new agreement structure and cost model.

BiSON expansion: Golden, Southern Colorado, State of Wyoming

BiSON connectivity
The BiSON Golden ring will connect CSM, NREL, and JeffcoSD in a high-speed and highly reliable fiber optic ring.
Three regional networks
BiSON now reaches west to Golden and south to Colorado Springs and Pueblo to connect six new FRGP participants.
SCONE topology
The SCONE BiSON fiber optic network will provide over 10 times the current bandwidth available to CC, CSU-P, and CU-CS.

The BiSON team has been working to expand the BiSON footprint to Golden, to Southern Colorado, and to the State of Wyoming offices in Cheyenne. The FRGP team has worked to design the ADVA Wavelength Division Multiplexing (WDM) network that will use existing and new fiber to construct a multi-10 Gbps network connecting these new FRGP participants. This year the focus has been on building the relationships and partnerships, establishing the FRGP agreements, completing the network design, ordering the hardware, and working to make the fiber optic paths whole.

In Golden, the expansion includes the Colorado School of Mines (CSM), the Jefferson County School District (JeffcoSD), and the National Renewable Energy Laboratory (NREL). CSM has been connected to BiSON for over a year using Colorado Department of Transportation (CDOT) fiber optics. Partnerships have been developed to leverage CSM fiber optic paths to extend the reach to NREL and JeffcoSD with minimal fiber construction. A non-redundant network has been installed and activated for CSM and NREL; JeffcoSD is pending a final fiber optic path installation. The next phase will be to install and implement the fully redundant fiber optic and WDM ring.
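
The operational value of the planned ring is that every Golden site keeps its connectivity if any single fiber span is cut. As a purely illustrative sketch (the hub location and the individual spans below are assumptions, not the engineering design), that property can be sanity-checked by removing each span in turn and confirming the topology stays connected:

    # Single-failure check for a candidate fiber ring topology.
    # Site names come from the text above; the hub and spans are hypothetical.
    sites = {"Hub", "CSM", "NREL", "JeffcoSD"}
    spans = {("Hub", "CSM"), ("CSM", "NREL"), ("NREL", "JeffcoSD"), ("JeffcoSD", "Hub")}

    def connected(nodes, edges):
        """Simple traversal: is every node reachable from an arbitrary start?"""
        start = next(iter(nodes))
        seen, frontier = {start}, [start]
        while frontier:
            node = frontier.pop()
            for a, b in edges:
                if a == node and b not in seen:
                    seen.add(b); frontier.append(b)
                elif b == node and a not in seen:
                    seen.add(a); frontier.append(a)
        return seen == nodes

    # A true ring survives any single span cut; a non-redundant spur does not.
    for cut in sorted(spans):
        status = "survives" if connected(sites, spans - {cut}) else "isolates a site"
        print(f"cut {cut}: {status}")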

In Southern Colorado, the expansion includes Colorado College (CC) and the University of Colorado at Colorado Springs (CU-CS) in Colorado Springs, and Colorado State University at Pueblo (CSU-P). CU-CS and CSU-P have been connected to the FRGP for a number of years using Colorado Department of Transportation (CDOT) fiber optics, sharing a 1 Gbps path. With increasing bandwidth demands, and in order to include CC and optimize routing, it was decided to integrate this path into BiSON and have UCAR assume its engineering and management. The next phase will be to install and implement the full fiber optic WDM network.

Finally, the BiSON team expanded BiSON to connect the State of Wyoming government offices in Cheyenne, Wyoming, enabling the State to become an FRGP participant. The State of Wyoming’s direct connection to the FRGP via BiSON greatly benefits the State as well as the other FRGP members who work with it. This partnership strengthens and expands the existing relationship between UCAR/NCAR and the State of Wyoming. As part of the FRGP/BiSON, the State of Wyoming provides connectivity to the approximately 400 K-12 schools that the State CIO’s office is responsible for.

The State of Wyoming is dual-connected: over fiber via the NWSC to the University of Wyoming, and over Bresnan/Level3 fiber via Stateline, and from there to the FRGP/UPoP via the Bi-State Optical Network (BiSON). Intra-FRGP participant traffic stays within the FRGP network, providing optimal direct connectivity.
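
Keeping participant-to-participant traffic inside the exchange is a routing-policy outcome: routes learned directly from FRGP participants are preferred over the same prefixes learned through upstream transit. The toy Python selection below only illustrates that idea; the prefix, peer names, and preference values are hypothetical, and this is not the actual FRGP router configuration.

    # Toy illustration of preferring a direct participant path over transit.
    # The prefix, peers, and preference values are hypothetical.
    routes = [
        {"prefix": "198.51.100.0/24", "via": "direct FRGP participant peering", "preference": 200},
        {"prefix": "198.51.100.0/24", "via": "commodity transit",               "preference": 100},
    ]

    best = max(routes, key=lambda r: r["preference"])
    print(f"{best['prefix']} -> {best['via']}")  # the direct FRGP path is chosen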

CG2 colocation move to ML and NWSC

CG2 servers relocated
Teresa Shibao puts the final touches on the new colocation racks in ML 29 after the move from CG2.

The Colocation Facilities Management (CFM) team has been tasked with consolidating colocation facilities across UCAR for efficiencies and cost savings. As part of that effort, it was decided to consolidate equipment from CG2 2042 into the Mesa Lab room 29 (ML29) and the NWSC data center (NWSC Room CM-31). After many months of planning and documentation, this effort was successfully completed in July 2014.

NETS played a major, lead role in the planning, design, ordering of racks and components, advance installation of the colocation areas and network, and in the actual moves to the NWSC on 14 June 2014 and to ML29 on 11–13 July 2014. At the NWSC, as part of a pilot program between NETS and CISL, KVM and remote console connectivity was provided to each relocated server. At ML, NETS technicians upgraded and standardized the physical cabling for each server’s primary network, intra-network, and out-of-band connectivity. Each relocated server’s cabling was standardized on Category 6A (F/UTP) cable, color coded for ease of management, electronically documented, and physically labeled. Both of these moves were implemented successfully and gracefully.
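
The electronic documentation amounts to a structured record per cable run. A minimal sketch of such a record is shown below; the field names, color assignment, and label format are hypothetical and are not the actual NETS database schema.

    # Minimal cable-run record of the kind kept as electronic documentation.
    # Field names, color codes, and the label format are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class CableRun:
        server: str            # relocated server name
        network: str           # "primary", "intra-network", or "out-of-band"
        color: str             # color code assigned to that network role
        panel_port: str        # patch panel and port at the far end
        cable_type: str = "Cat6A F/UTP"

        def label(self) -> str:
            """Text printed on the physical cable label."""
            return f"{self.server} | {self.network} | {self.panel_port}"

    run = CableRun(server="example-host", network="out-of-band",
                   color="yellow", panel_port="ML29-PP01-24")
    print(run.label())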