MSPP Network Design Case Study


A major healthcare services provider in a large metropolitan area is preparing to implement a significant upgrade to its data and telecommunications networks. Backbone connectivity for University Healthcare System, Inc. (UHCS), in Brounsville is currently delivered over multiple T1, DS3, and OC-3 links leased from a local exchange carrier, BrounTel. These leased lines are used for connectivity among various company locations, such as the corporate headquarters campus and the hospital locations throughout the metro area. These services are provisioned through a combination of BrounTel copper T1 span lines, legacy point-to-point asynchronous optical multiplexers, and first-generation SONET OC-3 and OC-12 systems. The company is seeking to upgrade its current network services for several reasons:

  • Network survivability To ensure business continuity, the UHCS IT staff wants to improve the reliability of its leased network services. In the past, the nonredundant portions of the network serving some of its locations have failed, causing unacceptable service outage times.

  • Flexibility A major driver in the upgrade decision is the capability to add bandwidth and services with the simple addition or upgrade of existing components, versus the delay associated with conditioning new T1 span lines or adding fiber facilities.

  • Scalability UHCS seeks to future-proof its infrastructure so that the network can grow as the business continues to expand.

  • Advanced network services Line-rate and sub-rate GigE connections are among the current network requirements, and storage networking, 10 GigE, and wavelength services could become requirements in the long term. These services cannot be provided using the existing BrounTel network facilities that serve the UHCS locations.

  • Cost reduction By consolidating its network onto a simpler, more advanced technology platform, the company plans to reduce the recurring charges paid to BrounTel.

After discussing service requirements and contract terms with BrounTel, the IT managers have elected to contract with BrounTel to provide a leased dedicated SONET ring (DSR) service for connectivity between company sites, and for access to the public switched telephone network (PSTN). BrounTel will deploy a Cisco ONS 15454 MSPP solution for the DSR.

MSPP Ring Network Design

A total of seven sites in various parts of the metro area will need connectivity to the new network. BrounTel has existing standard single-mode fiber (SMF) optic cables serving some of the locations; it will use existing cable or build new optical cable facilities as required for diverse routing between the company locations and multiple BrounTel central offices. Three central office locations will have MSPP nodes on the SONET ring; the others will serve as fiber patch (or pass-through) locations. Table 6-8 gives a list of location names, addresses, and site types.

Table 6-8. University Healthcare System DSR Locations and Requirements

| Node Number | Site Name | Site Address | Site Type |
|---|---|---|---|
| 0 | University Hospital North (UHCS) | 4303 Thach Avenue | MSPP Node |
| - | Foy Central Office (BrounTel) | 307 Duncan Drive | Fiber Patch |
| 1 | Magnolia Central Office (BrounTel) | 41 Magnolia Avenue | MSPP Node |
| 2 | University Healthcare HQ (UHCS) | 130 Donahue Drive | MSPP Node |
| - | Poplar Street Central Office (BrounTel) | 8016 Poplar Street | Fiber Patch |
| 3 | Brounsville Main Central Office (BrounTel) | 2004 Elm Street | MSPP Node |
| 4 | University Medical Center (UHCS) | 34 South College Street | MSPP Node |
| - | Beech Street Central Office (BrounTel) | 2311 Beech Street | Fiber Patch |
| 5 | University Hospital East (UHCS) | 1442 Wire Road | MSPP Node |
| - | Roosevelt Drive Central Office (BrounTel) | 1717 Roosevelt Drive | Fiber Patch |
| 6 | Samford Avenue Central Office (BrounTel and IXC POP) | 940 Samford Avenue | MSPP Node |
| - | Mell Street Central Office (BrounTel) | 1183 Mell Street | Fiber Patch |
| 7 | University Hospital South (UHCS) | 9440 Parker Circle | MSPP Node |
| - | Haley Central Office (BrounTel) | 1957 Concourse Way | Fiber Patch |
| - | Ross Central Office (BrounTel) | 60 Wilmore Road | Fiber Patch |
| 8 | UHCS Data Center (UHCS) | 1983 Draughon Trace | MSPP Node |
| - | Ramsay Central Office (BrounTel) | 2322 Hemlock Drive | Fiber Patch |
| 9 | Jordan Memorial Hospital (UHCS) | 1969 Goodwin Lane | MSPP Node |
| - | Morrison Central Office (BrounTel) | 141 Morrison Drive | Fiber Patch |


Table 6-9 shows the measured (for existing facilities) or calculated (for proposed facilities) fiber cable loss figures for the ring facilities. All loss figures include losses due to splices, connectors, and patch panels, as well as the fiber cable loss itself.

Table 6-9. Fiber Cable Losses for UHCS Ring Network

| Fiber Span Number | From Location | To Location | Distance (km) | Loss at 1310 nm (dB) | Loss at 1550 nm (dB) |
|---|---|---|---|---|---|
| 1 | University Hospital North (UHCS) | Foy Central Office (BrounTel) | 12.3 | 7.42 | 5.38 |
| 2 | Foy Central Office (BrounTel) | Magnolia Central Office (BrounTel) | 13.8 | 8.12 | 5.79 |
| 3 | Magnolia Central Office (BrounTel) | University Healthcare HQ (UHCS) | 17.1 | 9.53 | 6.73 |
| 4 | University Healthcare HQ (UHCS) | Poplar Street Central Office (BrounTel) | 6.2 | 3.73 | 2.76 |
| 5 | Poplar Street Central Office (BrounTel) | Brounsville Main Central Office (BrounTel) | 7.1 | 4.15 | 3.08 |
| 6 | Brounsville Main Central Office (BrounTel) | University Medical Center (UHCS) | 18.9 | 11.16 | 6.98 |
| 7 | University Medical Center (UHCS) | Beech Street Central Office (BrounTel) | 9.4 | 5.92 | 3.70 |
| 8 | Beech Street Central Office (BrounTel) | University Hospital East (UHCS) | 8.7 | 5.48 | 3.43 |
| 9 | University Hospital East (UHCS) | Roosevelt Drive Central Office (BrounTel) | 13.2 | 8.58 | 5.37 |
| 10 | Roosevelt Drive Central Office (BrounTel) | Samford Avenue Central Office (BrounTel and IXC POP) | 7.0 | 4.73 | 3.25 |
| 11 | Samford Avenue Central Office (BrounTel and IXC POP) | Mell Street Central Office (BrounTel) | 12.3 | 7.06 | 4.97 |
| 12 | Mell Street Central Office (BrounTel) | University Hospital South (UHCS) | 21.0 | 11.38 | 7.11 |
| 13 | University Hospital South (UHCS) | Haley Central Office (BrounTel) | 8.3 | 4.32 | 2.70 |
| 14 | Haley Central Office (BrounTel) | Ross Central Office (BrounTel) | 12.4 | 8.10 | 5.65 |
| 15 | Ross Central Office (BrounTel) | UHCS Data Center (UHCS) | 13.8 | 8.99 | 6.07 |
| 16 | UHCS Data Center (UHCS) | Ramsay Central Office (BrounTel) | 6.7 | 4.25 | 2.66 |
| 17 | Ramsay Central Office (BrounTel) | Jordan Memorial Hospital (UHCS) | 8.3 | 4.17 | 3.13 |
| 18 | Jordan Memorial Hospital (UHCS) | Morrison Central Office (BrounTel) | 10.3 | 7.72 | 5.33 |
| 19 | Morrison Central Office (BrounTel) | University Hospital North (UHCS) | 8.4 | 4.60 | 3.55 |


Because of the relatively short distances between MSPP node locations, polarization mode dispersion (PMD) will not be an issue in this deployment.
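For a rough sense of how the calculated loss figures in Table 6-9 can be derived for proposed facilities, the following Python sketch estimates end-to-end span loss from distance, connector count, and splice count. The attenuation coefficients and connector/splice allowances shown are illustrative assumptions only, not BrounTel's actual engineering values.

```python
# Rough span-loss estimate for a proposed fiber section. The attenuation
# coefficients and connector/splice allowances below are illustrative
# assumptions, not BrounTel's actual engineering figures.

def estimated_span_loss(distance_km: float,
                        atten_db_per_km: float,
                        connectors: int = 4,
                        splices: int = 6,
                        connector_loss_db: float = 0.5,
                        splice_loss_db: float = 0.1) -> float:
    """Return the estimated end-to-end loss (dB) for one fiber span."""
    return (distance_km * atten_db_per_km
            + connectors * connector_loss_db
            + splices * splice_loss_db)

# Example: fiber span 1 (University Hospital North to Foy CO), 12.3 km
print(estimated_span_loss(12.3, 0.35))  # ~6.9 dB at 1310 nm (Table 6-9: 7.42 dB)
print(estimated_span_loss(12.3, 0.25))  # ~5.7 dB at 1550 nm (Table 6-9: 5.38 dB)
```

The estimates land in the neighborhood of the tabulated values; the differences simply reflect the assumed splice counts and per-kilometer attenuation, which vary from cable to cable.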

To determine the bandwidth requirements for the ring, the UHCS service demands must be considered. These services will be provided using the DSR:

  • Multipoint Switched Ethernet will be used to connect the LANs at all the UHCS sites together using a resilient packet ring (RPR) with GigE links.

  • Private line Ethernet connections, which are point-to-point Ethernet transport "pipes," will be used between a subset of the UHCS sites.

  • TDM services, including DS1 and DS3 links, will be required for transport of voice traffic between UHCS Private Branch eXchange (PBX) systems, and between UHCS sites and BrounTel central offices.

Table 6-10 lists the requirements individually, showing the planned circuits for the ring.

Table 6-10. DSR Circuit Requirements

| Circuit Type | Circuit Size | Quantity | Protection? | Locations | Purpose |
|---|---|---|---|---|---|
| GigE | STS-24c | 1 | None | Node 0 to Node 2 | RPR for production data |
| GigE | STS-24c | 1 | None | Node 2 to Node 4 | RPR for production data |
| GigE | STS-24c | 1 | None | Node 4 to Node 5 | RPR for production data |
| GigE | STS-24c | 1 | None | Node 5 to Node 7 | RPR for production data |
| GigE | STS-24c | 1 | None | Node 7 to Node 8 | RPR for production data |
| GigE | STS-24c | 1 | None | Node 8 to Node 9 | RPR for production data |
| GigE | STS-24c | 1 | None | Node 9 to Node 0 | RPR for production data |
| GigE | STS-24c | 1 | UPSR | Node 0 to Node 2 | Private line video and data application |
| GigE (Sub-Rate) | STS-12c | 1 | UPSR | Node 0 to Node 5 | Private line video and data application |
| GigE (Sub-Rate) | STS-12c | 1 | UPSR | Node 0 to Node 7 | Private line video and data application |
| DS3 | STS-1 | 2 | UPSR | Node 1 to Node 8 | Data access |
| DS3 | STS-1 | 3 | UPSR | Node 2 to Node 6 | Data access |
| DS3 | STS-1 | 1 | UPSR | Node 2 to Node 3 | Data access |
| DS1 | VT1.5 | 3 | UPSR | Node 0 to Node 9 | PBX voice trunks |
| DS1 | VT1.5 | 4 | UPSR | Node 0 to Node 2 | PBX voice trunks |
| DS1 | VT1.5 | 5 | UPSR | Node 1 to Node 8 | Voice access |
| DS1 | VT1.5 | 2 | UPSR | Node 2 to Node 9 | PBX voice trunks |
| DS1 | VT1.5 | 4 | UPSR | Node 2 to Node 8 | PBX voice trunks |
| DS1 | VT1.5 | 3 | UPSR | Node 2 to Node 7 | PBX voice trunks |
| DS1 | VT1.5 | 22 | UPSR | Node 2 to Node 6 | Voice access |
| DS1 | VT1.5 | 3 | UPSR | Node 2 to Node 5 | PBX voice trunks |
| DS1 | VT1.5 | 4 | UPSR | Node 2 to Node 4 | PBX voice trunks |
| DS1 | VT1.5 | 2 | UPSR | Node 2 to Node 3 | Voice access |
| DS1 | VT1.5 | 3 | UPSR | Node 4 to Node 8 | PBX voice trunks |
| DS1 | VT1.5 | 2 | UPSR | Node 5 to Node 7 | PBX voice trunks |
| DS1 | VT1.5 | 2 | UPSR | Node 7 to Node 9 | PBX voice trunks |
| DS1 | VT1.5 | 2 | UPSR | Node 7 to Node 8 | PBX voice trunks |


To calculate the bandwidth requirements for the ring, simply add each of the individual requirements to arrive at the total number of STS-1s needed. This calculation is shown in Table 6-11.

Table 6-11. Ring Bandwidth Requirements

| Circuit(s) | Number of Ring STS-1s Required |
|---|---|
| (7) Unprotected GigE RPR links | 24 |
| (1) UPSR-Protected Line-Rate GigE | 24 |
| (2) UPSR-Protected Sub-Rate GigE (12 STS-1) | 24 |
| (6) UPSR-Protected DS3s | 6 |
| (61) UPSR-Protected DS1s | 3 |
| Total STS-1s Required | 81 |


Note

The GigE links that form the RPR reuse the same bandwidth throughout the ring because they are built without SONET protection.


Because the initial requirements total 81 STS-1s, an OC-192 ring will be used. This allows sufficient capacity for the existing service requirements, as well as for future growth of the network.
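As a quick cross-check of Table 6-11, the short Python sketch below tallies the ring STS-1 requirements using the circuit quantities from Table 6-10 and the packing rules described above (RPR bandwidth reuse and 28 VT1.5-mapped DS1s per STS-1). It is an illustrative calculation, not a provisioning tool.

```python
import math

VT15_PER_STS1 = 28  # DS1s are mapped into VT1.5s, 28 per STS-1

rpr_sts1       = 24                    # 7 unprotected STS-24c links reuse the same 24 STS-1s
line_rate_gige = 1 * 24                # one UPSR-protected STS-24c private line
sub_rate_gige  = 2 * 12                # two UPSR-protected STS-12c private lines
ds3_sts1       = 6 * 1                 # six UPSR-protected DS3s, one STS-1 each
ds1_sts1       = math.ceil(61 / VT15_PER_STS1)  # 61 DS1s -> 3 STS-1s

total = rpr_sts1 + line_rate_gige + sub_rate_gige + ds3_sts1 + ds1_sts1
print(total)         # 81
print(total <= 192)  # True: fits within an OC-192 ring, leaving room for growth
```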

OC-192 Ring Transmission Design

Having defined the network bandwidth requirement as OC-192, you can now select the appropriate OC-192 interfaces to equip at each ONS 15454 MSPP node to link the ring sites. ONS 15454 OC-192 IR interfaces transmit at a nominal wavelength of 1550 nm and have an allowable link loss budget of about 13 dB. OC-192 LR interfaces also transmit at 1550 nm and have an allowable link loss budget of 26 dB. An SR OC-192 interface is also available, but its small allowable link loss budget is not suitable for the distances involved in the UHCS application. Therefore, either IR or LR optics will be used, with 10 dB serving as the "breakpoint" between the two. This leaves 3 dB of margin for future loss increases caused by fiber cable degradation, repair splicing, and component aging.
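The selection rule can be expressed as a short calculation: sum the 1550-nm losses of the fiber sections between adjacent MSPP nodes (including any pass-through offices), then choose IR optics up to the 10-dB breakpoint and LR optics beyond it. The Python sketch below illustrates the rule for a few representative node-to-node links using the loss values from Table 6-9; the complete results appear in Table 6-12.

```python
# Sketch of the IR/LR selection rule: use IR optics up to 10 dB of measured
# link loss (13 dB IR budget minus 3 dB margin), LR optics above it.
# Per-section 1550 nm losses are taken from Table 6-9.

IR_BUDGET_DB = 13.0
MARGIN_DB = 3.0
BREAKPOINT_DB = IR_BUDGET_DB - MARGIN_DB  # 10 dB

# A few node-to-node links and their 1550 nm section losses (Table 6-9 spans)
node_links = {
    ("Node 0", "Node 1"): [5.38, 5.79],        # spans 1-2, via Foy CO
    ("Node 6", "Node 7"): [4.97, 7.11],        # spans 11-12, via Mell Street CO
    ("Node 7", "Node 8"): [2.70, 5.65, 6.07],  # spans 13-15, via Haley and Ross COs
    ("Node 8", "Node 9"): [2.66, 3.13],        # spans 16-17, via Ramsay CO
}

for (east, west), section_losses in node_links.items():
    link_loss = sum(section_losses)
    optic = "OC-192 1550 IR" if link_loss <= BREAKPOINT_DB else "OC-192 1550 LR"
    print(f"{east} -> {west}: {link_loss:.2f} dB -> {optic}")
```

Running the sketch reproduces the selections shown in Table 6-12: the Node 0-1, 6-7, and 7-8 links exceed 10 dB and therefore use LR optics, while the Node 8-9 link stays on IR.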

Based on the specifications of the various ONS 15454 OC-192 interfaces and the loss characteristics of each fiber section (outlined in Table 6-9), the node-to-node interface types can be determined for the ring. OC-192/10G operation is allowed in chassis Slots 5, 6, 12, and 13. You use a pair of these slots at each location for the East- and West-facing ring interfaces. Although any combination of two of the four available slots is acceptable, uniformly select Slots 5 and 6 at each of the nodes for operational simplicity. Table 6-12 shows the selection of OC-192 optics for each ring span.

Table 6-12. OC-192 Ring Optics for UHCS DSR

| From East Node/Slot | To West Node/Slot | Loss at 1550 nm (dB) | OC-192 Interface Type |
|---|---|---|---|
| Node 0 Slot 6 | Node 1 Slot 5 | 11.17 | OC-192 1550 LR |
| Node 1 Slot 6 | Node 2 Slot 5 | 6.73 | OC-192 1550 IR |
| Node 2 Slot 6 | Node 3 Slot 5 | 5.84 | OC-192 1550 IR |
| Node 3 Slot 6 | Node 4 Slot 5 | 6.98 | OC-192 1550 IR |
| Node 4 Slot 6 | Node 5 Slot 5 | 7.13 | OC-192 1550 IR |
| Node 5 Slot 6 | Node 6 Slot 5 | 8.62 | OC-192 1550 IR |
| Node 6 Slot 6 | Node 7 Slot 5 | 12.08 | OC-192 1550 LR |
| Node 7 Slot 6 | Node 8 Slot 5 | 14.42 | OC-192 1550 LR |
| Node 8 Slot 6 | Node 9 Slot 5 | 5.79 | OC-192 1550 IR |
| Node 9 Slot 6 | Node 0 Slot 5 | 8.88 | OC-192 1550 IR |


In addition to the optical trunk interface card selection, one of the BrounTel central office nodes will be designated as the Gateway Network Element (GNE) and will connect to BrounTel's Network Operations Center (NOC) using its IP-based interoffice management network. The Magnolia central office MSPP node will be chosen as the GNE.

For the purposes of network synchronization, each MSPP node located in a BrounTel central office will be connected to the office BITS and will be configured as externally timed. Line timing will be configured on the ONS 15454 systems located in the UHCS customer premises sites, with the OC-192 optical interface ports on the cards installed in Slots 5 and 6 serving as the primary and secondary reference sources.
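The timing plan can be summarized as a simple per-node rule, sketched below in Python. This is illustrative only; node groupings follow Table 6-8, and the reference labels are placeholders rather than actual BrounTel BITS identifiers.

```python
# Per-node timing rule: BrounTel central office nodes are externally timed
# from the office BITS; UHCS premises nodes line-time from the OC-192 ports
# in Slots 5 and 6. Reference labels are placeholders.

CO_NODES = {1: "Magnolia CO", 3: "Brounsville Main CO", 6: "Samford Avenue CO"}
UHCS_NODES = {0, 2, 4, 5, 7, 8, 9}

def timing_config(node_id: int) -> dict:
    """Return the timing mode and reference list for a ring node."""
    if node_id in CO_NODES:
        return {"mode": "external", "references": ["BITS-1", "BITS-2"]}
    if node_id in UHCS_NODES:
        return {"mode": "line", "references": ["Slot 5 OC-192", "Slot 6 OC-192"]}
    raise ValueError(f"unknown node {node_id}")

print(timing_config(1))  # external (BITS) timing at a BrounTel central office
print(timing_config(0))  # line timing at a UHCS customer premises site
```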

With all the necessary parameters now defined, you can prepare all necessary engineering documentation, such as the network map, shelf card slot assignments, chassis EIA equipage, tributary protection group configuration, and cabling termination assignments.

Network Map

The network map, or ring map, is a key piece of documentation that assists in bringing an MSPP network online and serves as a future as-built reference for planning, troubleshooting, and performing upgrades or additions. Figure 6-13 shows the network map for the UHCS DSR. The following information has been included:

  • Graphical representation of the network topology

  • Location of all MSPP nodes and fiber pass-through offices (no equipment, just interconnection of outside plant fiber cables)

  • Distance and loss/attenuation figures for each fiber link

  • Slot assignments and card types for the interconnecting OC-192 interface cards

  • Node names, IP addresses, and timing configurations

  • GNE assignment and IP address of the default router

  • Software version

    Figure 6-13. UHCS DSR Ring Map

The node numbers and Ring ID have also been provided as reference information; however, because this is not a BLSR network, this information is not required to provision the ring nodes.

Shelf Card Slot Assignments, EIA Equipage, and Tributary Protection Group Configuration

The versatility of the ONS 15454 MSPP gives the BrounTel engineers multiple options when selecting the chassis card slot assignments. Some assignments will be common for all ring nodes; others might vary to allow for maximum flexibility to add future services to the network.

An important factor in determining card slot assignments for electrical interface cards, such as the DS1 and DS3 cards in the UHCS ring, is the type of tributary card protection required. Additionally, these interface types and their locations help determine the type of EIA that must be ordered for each ONS 15454 chassis.

All 10 of the ring nodes have TCC2P cards in Slots 7 and 11. Recall that two TCC cards are required in every ONS 15454 MSPP node. Likewise, two XC-VXC-10G cross-connect cards will be placed in Slots 8 and 10 of each node. Finally, for standardization and operational simplicity, the OC-192 ring interface cards will be placed in Slots 5 (West) and 6 (East) in each of the ring nodes. For the other interface cards required in the ring nodes, customized slot assignments will be specified. According to the conditions of the DSR service contract, BrounTel will design all TDM service interfaces to be card-protected in either 1:1 or 1:N protection groups. The Ethernet service interfaces, of course, will be unprotected.

At the University Hospital North node (Node 0), the initial service-termination requirements include seven DS1 circuits, three private-line GigE links (one line rate and two subrate), and the GigE RPR circuit connections. Figure 6-14 shows the shelf diagram with card locations for this node. The DS1s require a single working DS1-14 card. This card will be placed in Slot 4, with a DS1N-14 card installed in Slot 3 for protection. A 1:N protection group will be established for these cards, as indicated in the diagram. The use of Slots 3 and 4 allows for future growth in DS1s; Slots 1 and 2 are left vacant for potential future DS3 requirements.

Two G1K-4 cards are required to terminate the line-rate (STS-24c) circuit and the two sub-rate circuits (STS-12c each), whose combined bandwidth exceeds 12 STS-1s; Slots 16 and 17 will be used for these cards. Note that the GBIC types for the ports to be equipped are indicated on the shelf diagram. The "SX" designation indicates that these ports will be equipped with 1000Base-SX (850-nm) GBICs. A single ML-Series card is required to connect this node to the RPR overlay. Because UHCS requires redundant GigE links from this interface, an ML1000-2 card with dual 1000Base-SX SFPs is specified; this card will be installed in Slot 15. Also, an AIC-I card will be installed in Slot 9 so that, through the SONET overhead, the BrounTel network operations center can monitor the contact closure alarms from the associated direct current (DC) power plant. Card Slots 1, 2, 12, 13, and 14 will not be used for service interfaces initially and will be equipped with blank faceplates.

Figure 6-14. University Hospital North Node Shelf Diagram


The DS1 interfaces in this node require backplane electrical connections. Because of the requirement for possible future DS3s, a UBIC-H EIA will be installed on Side A of the rear of the chassis. This enables both DS1 and DS3 interfaces to be cabled out from the rear of the shelf. Because all current and future requirements for Side B are front-cabled interface cards (Ethernet, storage, or optical), an EIA is not required to be installed on the rear of Side B, and the default blank cover can be used.
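To illustrate how the Node 0 slot plan maps onto the chassis rules stated above (TCC2P cards in Slots 7 and 11, cross-connects in Slots 8 and 10, OC-192 cards only in Slots 5, 6, 12, or 13), the following Python sketch encodes the assignments from Figure 6-14 and runs a simple consistency check. The validation logic is a hypothetical illustration, not Cisco provisioning software.

```python
# Slot plan for Node 0 (University Hospital North) as described in the text.
# The check below is an illustrative sketch of the stated slotting rules.

NODE0_SLOTS = {
    3: "DS1N-14", 4: "DS1-14",
    5: "OC-192 1550 IR", 6: "OC-192 1550 LR",
    7: "TCC2P", 8: "XC-VXC-10G", 9: "AIC-I", 10: "XC-VXC-10G", 11: "TCC2P",
    15: "ML1000-2", 16: "G1K-4", 17: "G1K-4",
}

OC192_SLOTS = {5, 6, 12, 13}  # slots that support OC-192/10G cards

def check_node(slots: dict) -> list[str]:
    """Return a list of rule violations for a node's slot plan."""
    errors = []
    if not (slots.get(7) == "TCC2P" and slots.get(11) == "TCC2P"):
        errors.append("two TCC2P cards required in Slots 7 and 11")
    if not (slots.get(8, "").startswith("XC") and slots.get(10, "").startswith("XC")):
        errors.append("cross-connect cards required in Slots 8 and 10")
    for slot, card in slots.items():
        if card.startswith("OC-192") and slot not in OC192_SLOTS:
            errors.append(f"OC-192 card not allowed in Slot {slot}")
    return errors

print(check_node(NODE0_SLOTS))  # [] -> the plan satisfies the stated rules
```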

Each of the other MSPP nodes in the UHCS ring will be designed using similar interfaces, with an eye on future network-expansion requirements. A brief description of the requirements, interfaces, and protection groups for each is given in the next several sections, along with an accompanying shelf diagram.

Magnolia Central Office (Node 1)

Requirements are for five DS1s and two DS3s. Both interface types will be slotted on Side B, and a UBIC-H EIA will be equipped on the rear to accommodate the cabling for these cards. See Figure 6-15 for the shelf diagram.

Figure 6-15. Magnolia Central Office Node Shelf Diagram


UHCS Headquarters (Node 2)

Initial requirements are 44 DS1s, 4 DS3s, 1 line-rate private line Ethernet circuit, and the RPR GigE links. The high-density (56-port) DS1 card will be used on Side B, with the DS3 cards on Side A. Both shelf sides will be equipped with the UBIC-H EIA. See Figure 6-16 for the shelf diagram.

Figure 6-16. UHCS Headquarters Node Shelf Diagram


Brounsville Main Central Office (Node 3)

Requirements are for two DS1s and one DS3. Both interface types will be slotted on Side B, and a UBIC-H EIA will be equipped on the rear to accommodate the cabling for these cards. See Figure 6-17 for the shelf diagram.

University Medical Center (Node 4)

Initial requirements are for seven DS1s, as well as the RPR GigE links. DS1 interface cards and a UBIC-H EIA will be installed on Side B, with the ML1000-2 card and associated SX SFPs on Side A. See Figure 6-18 for the shelf diagram.

Figure 6-17. Brounsville Main Central Office Node Shelf Diagram


Figure 6-18. University Medical Center Node Shelf Diagram


University Hospital East (Node 5)

Five DS1s, a single subrate GigE private line circuit, and the RPR GigE links must be provisioned at this location. A DS1-14 card will be placed in Slot 14 and will be protected by a DS1N-14 card in Slot 15. The Ethernet cards will be installed on the A Side. Because electrical interfaces are required to be cabled only from the B side of the shelf, no EIA will be required to be installed on Side A. See Figure 6-19 for the shelf diagram.

Figure 6-19. University Hospital East Node Shelf Diagram


Samford Avenue Central Office (Node 6)

Initial requirements are for 22 DS1s and 3 DS3s. A pair of DS1-14 cards will be placed in Slots 16 and 17, with both cards being protected in a 1:N group by a DS1N-14 card in Slot 15. The DS3-12E card will be placed in Slot 2 and will be protected by a DS3N-12E card in Slot 3. UBIC-H EIAs will be installed on the rear for both sides. See Figure 6-20 for the shelf diagram.

Figure 6-20. Samford Avenue Central Office Node Shelf Diagram


University Hospital South (Node 7)

Nine DS1s, a subrate (STS-12c) GigE private line connection, and the RPR GigE links are required. See Figure 6-21 for the shelf diagram.

Figure 6-21. University Hospital South Node Shelf Diagram


UHCS Data Center (Node 8)

Initial requirements include 14 DS1s, 2 DS3s, and the RPR GigE links. Electrical interfaces will be installed only on Side B. See Figure 6-22 for the shelf diagram.

Figure 6-22. UHCS Data Center Node Shelf Diagram


Jordan Memorial Hospital (Node 9)

Seven DS1s and the RPR GigE links are the initial requirements here. See Figure 6-23 for the shelf diagram.

Figure 6-23. Jordan Memorial Hospital Node Shelf Diagram


Cabling Terminations

The UHCS DSR requires various interface types to be cabled out for interconnecting to the customer premises equipment (CPE) at UHCS locations, or for interfacing with the BrounTel or Interexchange Carrier (IXC) networks at the central office locations. Figure 6-24 shows a diagram with typical interface cabling for an ONS 15454 node location on the UHCS ring: the UHCS Headquarters node. The OC-192 ring optics will be cabled to the outside plant (OSP) fiber-termination panel using single-mode optical fibers from the SC faceplate connectors. DS1 and DS3 interface cards will be cabled to digital signal cross-connect (DSX) panels via the backplane UBIC EIA connectors. Ethernet interface cards, including the G1K-4 and ML1000-2, will be cabled to an optical splitter module panel using multimode fibers from the GBIC SC faceplate connectors (G1K-4) or the SFP LC faceplate connectors (ML1000-2).

Figure 6-24. Cabling Termination Diagram, UHCS Headquarters Node


Table 6-13 shows an example cabling termination assignment chart for the UHCS Headquarters location.

Table 6-13. Customer Drop Cabling Terminations for UHCS HQ Node

| Panel Type | Panel Location | Connector/Jack | To Equipment | To Slot/Port |
|---|---|---|---|---|
| Optical | RR 101 PNL 4 | 1 | ONS RR 101A | Slot 5 Tx |
| | | 2 | ONS RR 101A | Slot 5 Rx |
| | | 3 | ONS RR 101A | Slot 6 Tx |
| | | 4 | ONS RR 101A | Slot 6 Rx |
| | | 5-24 | FUTURE | FUTURE |
| Optical/Splitter | RR 101 PNL 3 | Mod 1 SRC Tx | ONS RR 101A | Slot 4/1 Tx |
| | | Mod 1 SRC Rx | ONS RR 101A | Slot 4/1 Rx |
| | | Mod 1 Cus Tx | ONS RR 101A | CPE Tx |
| | | Mod 1 Cus Rx | ONS RR 101A | CPE Rx |
| | | Mod 2 SRC Tx | ONS RR 101A | Slot 14/1 Tx |
| | | Mod 2 SRC Rx | ONS RR 101A | Slot 14/1 Rx |
| | | Mod 2 Cus Tx | ONS RR 101A | CPE Tx |
| | | Mod 2 Cus Rx | ONS RR 101A | CPE Rx |
| | | Mod 3 SRC Tx | ONS RR 101A | Slot 14/2 Tx |
| | | Mod 3 SRC Rx | ONS RR 101A | Slot 14/2 Rx |
| | | Mod 3 Cus Tx | ONS RR 101A | CPE Tx |
| | | Mod 3 Cus Rx | ONS RR 101A | CPE Rx |
| DSX-1 | RR 101 PNL 2 | 1-56 | ONS RR 101A | Slot 16/1-56 |
| | | 57-84 | FUTURE | FUTURE |
| DSX-3 | RR 101 PNL 1 | 1-12 | ONS RR 101A | Slot 2/1-12 |




