Who: Tony Pearson is senior associate director, Video Communication Services, from the DELTA team at NC State
North Carolina State University has long been a leader in distance learning, providing courses that students follow at home, at work, or wherever they are. We have served distance learners for nearly 40 years: today we reach more than 50,000 students around the world, and several of our online programs are rated in the top 10 by U.S. News & World Report (the leader in college rankings) and other ranking organizations.
About Distance Education and Learning Technology Applications (DELTA)
DELTA’s role within the Office of the Provost is to foster the integration and support of learning technologies in NC State’s academic programs, both on the campus and at a distance. We coordinate the funding and production of all distance-based credit programs and courses for the university. We promote high-quality education by extending the reach of the faculty and collaboratively applying expertise in technology and pedagogy in an efficient, effective and service-oriented environment.
Video Communication Services
Video Communication Services predates DELTA, beginning in the 1980s. Courses were recorded on physical media (first tapes, then CDs, then DVDs) that were shipped directly to students. During this time, courses and special programs were also broadcast via LPTV, microwave, satellite, ISDN, MPEG-2, and H.323.
Prior to H.323, satellite and microwave were used primarily for synchronous learning: remote students watched course lectures at the same time as those present on campus. They could respond to the professor via two-way video (H.323) or phone bridges, giving a real sense of interaction with the class.
Once the internet became a reliable way to deliver high-quality material, we began to move our courses online. The benefits were obvious: not just lower distribution cost, but immediacy, interactivity, and the improved ability to use the courses synchronously or asynchronously, thanks to enhanced collaboration applications and video-on-demand access. What set NC State apart was the demand to maintain a professional, broadcast-quality approach to the production of distance education courses. The workflow was built around a real-time production process, including live transmissions and synchronous connections for designated courses every semester.
All this was achieved in the beginning by creating what was originally called a Studio Classroom, now referred to as a media-enhanced classroom. Although these classrooms were used by traditional students on campus as well as those at a distance, these spaces were not your standard classroom. All classrooms were full production studios with specialized lighting, a production switcher, character generators, desktop or ceiling microphones, and up to six cameras where required.
As our distance education programs grew, more classrooms were added and the demand for greater classroom capacity climbed. During this period of growth, we started to see annual enrollment increases of 15% to 20%, which dictated the need for improved, efficient workflows.
As a result of the growth, all designated control rooms supporting media-enhanced classrooms were reduced to equipment closets to provide additional seating capacity inside the classrooms. This required a swift transition away from a control room model with dedicated technicians or video directors. Once a space was identified for a centralized control room (CCR), phase one was implemented with two consoles to support, control, and monitor three media-enhanced classrooms remotely.
One challenge was repurposing existing fiber no longer needed for microwave transmissions: with the CCR located in the heart of the main campus, all existing fiber runs had to be rerouted, and additional runs added, for a home run back to Ricks Hall Annex.
The AV infrastructure design required the use of single mode fiber to transmit one SDI video source per fiber strand for a total of four sources per classroom. This placed serious limitations on the number of signals we could transmit back to the CCR. Including cameras and computer sources, a media-enhanced classroom might have anywhere from 12 to 16 video sources to be managed. The current fiber infrastructure only gave us four video channels per classroom. The AV design solution was to put a multiviewer in each classroom and send quad-split screen feeds over the fiber to the CCR.
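The arithmetic behind that design choice is simple: a quad-split multiviewer packs four source windows into one SDI feed, so even a fully loaded classroom fits within the four fiber channels. A minimal sketch (a hypothetical helper, not part of our actual tooling):

```python
import math

def quad_feeds_needed(num_sources: int, windows_per_feed: int = 4) -> int:
    """Number of quad-split multiviewer feeds needed to carry every source."""
    return math.ceil(num_sources / windows_per_feed)

# A classroom with 12 to 16 sources fits in 3 to 4 quad-split feeds,
# matching the four fiber channels available per classroom.
for sources in (12, 16):
    print(sources, "sources ->", quad_feeds_needed(sources), "feeds")
```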
That decision meant technicians could see only a limited number of sources, and by using routing controls they could switch and manage various signals. Because the multiviewer-to-multiviewer design bundled all sources from each classroom into composite feeds, those feeds could not be broken apart to provide true confidence feeds of individual remote sources. Technicians had to be very creative with routing in order to see, monitor, and control everything.
Migration to IP
Moving forward, it was determined that we needed an open architecture solution to provide flexibility and scalability for the next generation of remote monitoring and control in our CCR. We also needed it to be standards-based, securing our future without binding us to a proprietary solution that would limit future development.
The process for selecting a solution took almost four years. We looked at a broad range of products that were on or entering the market. Because our routers and multiviewers were aging, our focus was on open architecture solutions using industry-standard interfaces from signal acquisition through signal delivery and monitoring. As we looked to move away from dedicated SDI-connected routers toward an all-IP processing platform, it became obvious that an SDI-to-IP solution would provide all the flexibility and scalability we required to replace systems that had reached end-of-life status. It also allowed us to continue using our legacy SDI equipment as part of the new architecture.
The good news was that our research coincided with the development of the SMPTE ST 2110 standard. While we researched other standards and reviewed alternate proposals for IP connectivity, our conclusion was that SMPTE ST 2110 represented the most stable future for our infrastructure. We also had to be realistic about how much we could achieve in the first phase. We could not jump to an all-ST 2110 system: we had to maintain the SDI infrastructure in which we had invested so much. So our solution was to build a new all-IP core at the CCR, giving us the flexibility to deploy SDI-to-IP devices in our classrooms.
The other challenge was the timeline. We had to fit this integration into the academic calendar, so we could not take any of the existing infrastructure offline until the end of the spring semester. We then had until mid-June to complete the installation. That left the month of July as our team’s learning time to dive in, figure out how to support these new technologies, and be ready for the start of the fall semester in mid-August.
Armed with our criteria list, we spent years evaluating all the likely vendors. It became clear that the solutions from Imagine Communications scored high on that list. Not only were they prepared to enter into a technical partnership with us, supporting us in all we were trying to do, but they provided products that made our requirements practical. For example, the Selenio Network Processor (SNP) is a small box containing four separate processing chains. Software defines what each chain does, and we use it mostly for converting between SDI digital video and SMPTE ST 2110 IP connectivity.
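For context, ST 2110 devices advertise each media flow with an SDP description that receivers use to subscribe. The fragment below is an illustrative example of what an ST 2110-20 video stream announcement looks like; the addresses, ports, and session name are made up, not our actual configuration:

```
v=0
o=- 1000001 1000001 IN IP4 192.0.2.10
s=Classroom SDI-to-IP channel 1 (example)
t=0 0
m=video 5004 RTP/AVP 96
c=IN IP4 239.1.1.1/64
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; depth=10; exactframerate=30000/1001; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017; TP=2110TPN
a=ts-refclk:ptp=IEEE1588-2008:traceable
a=mediaclk:direct=0
```

The `a=fmtp` line carries the video format parameters, and `a=ts-refclk` ties the stream to the PTP reference clock.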
So we did not have to abandon any of our SDI gear, and we integrated a powerful, flexible IP core. Imagine also provided EPIC multiviewers, which integrate well with the SNPs and give us the flexibility to set up whatever multiview displays we need, wherever we need them. The SNPs also provide 32 individual signals from each location, so we have plenty of capacity and can monitor as needed. That capacity allows us to easily reach other media-supported areas and additional classrooms within the same building.
Having the additional multiview displays is also great for redundancy: we like to have as much of the system duplicated as possible to minimize the chance of downtime. If the medium-term future calls for social distancing, SNP multiviewer control allows us to reduce the number of people in the CCR by posting support staff in designated equipment rooms. Thanks to the network, any technician in any location can control any combination of sources and destinations. This technology allowed us to take out a large SDI broadcast router and replace it with a completely non-blocking 256 x 256 HD router. The 1RU enterprise-class Cisco Ethernet switch, which acts as the core of the IP router fabric, gives us the freedom to grow and connect anything to anything.
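Conceptually, "non-blocking" means every destination can independently select any source, and no route ever contends with another; a given source can also fan out to many destinations at once. A toy model of that crosspoint behavior (a sketch for illustration, not vendor code):

```python
class NonBlockingRouter:
    """Toy model of a non-blocking N x N router: each destination
    independently maps to any source, so no take blocks another path."""

    def __init__(self, size: int = 256):
        self.size = size
        self._crosspoint = {}  # destination -> currently routed source

    def take(self, source: int, destination: int) -> None:
        """Route a source to a destination; overwrites any previous take."""
        if not (0 <= source < self.size and 0 <= destination < self.size):
            raise ValueError("source/destination out of range")
        self._crosspoint[destination] = source

    def source_for(self, destination: int):
        return self._crosspoint.get(destination)

router = NonBlockingRouter(size=256)
router.take(source=7, destination=0)  # classroom camera to a CCR monitor
router.take(source=7, destination=1)  # same source fanned out to a second destination
```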
Working with the integration team, AV integrator, and Imagine, the system was up and running quickly, and the DELTA team was ready to support classes within our 45-day window. The initial design and implementation still required us to encode sessions for delivery and record each session inside the classroom. Four months after implementation, the redundant Selenio Network Processor (SNP) at the core was configured for IP-to-SDI, which allowed us to centralize our classroom capture encoders at the core. It also created additional redundancy there, allowing us to switch over to standby encoders in the CCR. All of this is simply controlled by the Magellan SDN Orchestrator from Imagine.
A concern for us was that all sources had to be synchronized. In an IP network, that is best achieved using PTP (Precision Time Protocol), which is part of the ST 2110 standard. One of the features of ST 2110 is that it carries video and audio as separate streams rather than embedding the audio in the video, so we needed excellent system timing to ensure lip sync as well as clean switching. We also needed minimal latency, as there are times when an operator needs to take control of a device in a remote classroom.
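At its core, PTP is an exchange of four timestamps between the grandmaster and a follower clock, from which the follower computes its offset and the network path delay. A minimal sketch of that arithmetic (assuming a symmetric path, as PTP itself does):

```python
def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """t1: master sends Sync; t2: follower receives Sync;
    t3: follower sends Delay_Req; t4: master receives Delay_Req.
    Returns (follower clock offset, one-way path delay)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Example: follower clock runs 100 us ahead; one-way delay is 5 us.
offset, delay = ptp_offset_and_delay(t1=0.0, t2=105.0, t3=200.0, t4=105.0)
print(offset, delay)  # 100.0 5.0
```

The follower then steers its clock by the computed offset, which is what keeps separate video and audio streams in lockstep across the network.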
One of our biggest concerns during the installation period was that fiber runs of over 10 kilometers might be too far for high-bandwidth IP signals to reach. Good news: it was a non-issue.
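For a sense of the bandwidth involved: the bit rate of an uncompressed ST 2110-20 stream is roughly pixels times samples per pixel times bit depth times frame rate. A back-of-the-envelope estimate (active picture only, ignoring RTP/IP overhead):

```python
def st2110_20_bitrate_gbps(width: int, height: int, fps: float,
                           bit_depth: int = 10,
                           samples_per_pixel: int = 2) -> float:
    """Approximate active-video bit rate in Gb/s for 4:2:2 video,
    where each pixel carries a luma sample plus one alternating
    chroma sample (hence 2 samples per pixel)."""
    bits_per_frame = width * height * samples_per_pixel * bit_depth
    return bits_per_frame * fps / 1e9

print(round(st2110_20_bitrate_gbps(1920, 1080, 30), 2))  # 1.24 (Gb/s per stream)
```

Per-stream rates on the order of a gigabit or two are comfortably within a 10 Gb Ethernet link over single-mode fiber, which is why the long runs turned out not to be a problem.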
We are very proud of what we have achieved. When we bring visitors into our CCR they are genuinely impressed by what they see. The system is designed to support eight monitoring stations, with seven currently in operation. We have 25 to 30 encoders that can be monitored at any time, from any of our facilities. These stream live, and at the end of a class the recording is available on demand within 15 seconds. All workstations include tallies and talkback for each classroom, so if there is a problem the instructor simply pushes a help button. The technician can then use a reverse video feed to appear in the classroom and talk through and resolve the issue quickly.
What I am most proud of is that we have implemented a system that is transparent to the end users. Our faculty just go along with their teaching and everything works behind the scenes. That is the beauty of the system we designed with Imagine Communications solutions. We now have the flexibility and scalability to continue making NC State and DELTA the leader in distance learning.