
NOTTINGHAM TRENT UNIVERSITY
SCHOOL OF SCIENCE AND TECHNOLOGY

Video Conferencing in Wimax

Kolappan Bhoothalingam Ramalekshmi

2009

Project Report in Partial Fulfilment of the Requirements for the Degree of

MSc. ENGINEERING IN CYBERNETICS AND COMMUNICATION


Abstract

The real-time transfer of large video data in video conferencing applications over wireless networks poses many challenges, such as transmission delay, interference on the air interface, packet losses and low data rates. In this work a video conferencing system that can be used for communication between computers running the Windows operating system was implemented with the help of an online open-source example. A general algorithm for designing a video conference system, built from tools identified through background research on real-time communication systems, was also proposed. Using the implemented video conferencing application and the Windows Meeting Space software available in the Windows Vista operating system, the real-time transfer of video data was tested in a Wi-Fi network and a wired network under various conditions, and the packets transferred in these networks were analysed using the Wireshark software. Factors influencing the transfer of video data, such as size, quality and data compression, were analysed with the help of simple tests. The test results were then used to assess the feasibility of running this application over a WiMAX network, which was found to be well suited to this kind of application.


Acknowledgements

I would like to thank my project supervisor, Mr. Wayne Cranton, for his guidance throughout the whole project and for spending his precious time giving valuable suggestions that steered this work in the right direction.

I would like to thank Professor Abbot Frank for his encouragement and the enthusiasm he showed in the MDM project, which motivated me to work efficiently towards my goal.

I thank my MDM project team members, Mr. Srikanth Pentapati and Mr. Rajesh, for their valuable support and suggestions.

I heartily thank the faculty members of Nottingham Trent University for their support.

Last but not least, I take this opportunity to thank my friends for their moral support and encouragement.


Table of Contents

Abstract
Acknowledgements
Contents
List of Figures and Tables

1 INTRODUCTION
1.1 Aim
1.2 Objective
1.3 Involvement with Multidisciplinary Project
1.4 Importance of real time communication
1.5 Introduction to WiMAX technology
1.6 Structure of the work

2 Background of Real Time Communication
2.1 Sockets, Ports and Streaming
2.1.1 Sockets
2.1.2 Ports
2.1.3 Streaming
2.2 Encoding and Packetization
2.2.1 Encoding
2.2.2 Packetization
2.2.3 Error Correction
2.3 Protocols for Real Time Communication
2.3.1 Real-time Transport Protocol
2.3.2 Resource Reservation Protocol
2.4 Overview of WiMAX technology
2.4.1 Layers of IEEE 802.16
2.4.2 Use of MIMO technology and OFDM

3 Project Background and Proposed Plan
3.1 Project Background
3.2 Proposed Plan
3.3 Methods followed

4 Algorithms and Software Description
4.1 Algorithm
4.1.1 Transmission Node
4.1.1.1 Application Layer
4.1.1.2 Transmission Layer
4.1.2 Reception Mode
4.1.3 Communication
4.2 Software Description
4.2.1 Application Description
4.2.2 Initialization
4.2.3 Transmission node
4.2.4 Reception node
4.2.5 Video Conference between two nodes
4.2.6 Conference between multiple nodes

5 Experiments and Result Analysis
5.1 Windows Meeting Space
5.1.1 Experimental Setup
5.1.2 Real-time Video Transfer Analysis
5.1.2.1 IEEE 802.11n Wireless Local Area Network
5.1.2.2 Ad hoc Wireless Network
5.1.2.3 Ethernet Wired Network
5.2 Video Conferencing Software
5.2.1 Experimental Setup
5.2.2 Analysis of Video conferencing application test
5.2.2.1 IEEE 802.11n Wireless Local Area Network
5.2.2.2 Ad hoc Wireless Network
5.2.2.3 Ethernet Wired Network
5.2.2.4 Packet Analysis
5.2.3 Quality Test with Different Formats
5.3 Video Size Test
5.4 Feasibility of the Application in a WiMAX Network
5.4.1 Application layer
5.4.2 Medium Access Control layer
5.4.3 Physical Layer

6 Conclusion and Future Work
6.1 Conclusion
6.2 Future Work

References

Appendix


List of Figures and Tables

List of Figures
Figure 1.2 IEEE 802.16 protocol architecture (taken from [10])
Figure 1.4 Transmission node of the application layer
Figure 2.4 Transmission layer
Figure 3.4 Reception layer
Figure 4.4 Reception node of the application layer
Figure 5.4 A video conference system
Figure 6.4 Video conferencing application

List of Tables
Table 1.5 Test results of real-time video transfer using Windows Meeting Space
Table 2.5 Results of the video conferencing application test
Table 3.5 Results of the quality test
Table 4.5 Results of the video size test


1 INTRODUCTION

1.1 Aim

To test the real-time transfer of video data in a wireless network by designing a video conferencing system, and to use the results obtained to analyse the capability of a WiMAX network to handle real-time video communication.

1.2 Objective

The first objective is to design a general algorithm for developing a video conference application. This will be achieved by studying, through background research, the basic components essential for real-time communication of video data, from which the general algorithm will be proposed.

Using a flexible design tool, a video conferencing system should then be developed based on this general algorithm. By running this conference application, and other software that transfers video data between computers, the real-time video transfer between two nodes has to be tested and analysed. Different network environments should be used in the tests for an effective analysis.

Finally, research should be done into the capabilities of a WiMAX network, its various layers and its support for real-time communication. The results of the practical analysis carried out with the video conferencing application will be used to evaluate the feasibility of a WiMAX network supporting real-time transfer of video data.

1.3 Involvement with Multidisciplinary Project

This thesis was carried out as part of a multidisciplinary project: the deployment of a WiMAX network in a school in Nottingham, connecting it with other organisations for learning purposes. The project brings together expertise from universities, various companies and several institutions to plan the deployment. As a member of the multidisciplinary team, my specific role was to design a real-time video communication system that can be used to test the capability of the WiMAX network.

1.4 Importance of real time communication

Real-time communication is mainly used in applications such as e-learning, video and audio conferencing, and transport control. Critical meetings involving members in different geographic locations can be made simpler and easier with the help of such real-time conference systems, and they also have the potential to be applied in controlling devices from remote locations. Real-time applications need high transmission efficiency with little, ideally no, delay in the delivery of data packets.

1.5 Introduction to Wimax technology

WiMAX technology is based on the IEEE 802.16 standard and provides wire-free broadband access at high speed with good coverage. It eliminates the high installation cost and time of laying DSL cable to provide broadband access. Initially the IEEE 802.16 standard was defined to support only fixed wireless broadband, but the revised version, IEEE 802.16e, was later extended to support mobile access as well. [8][10]

WiMAX uses microwave frequencies for transmission. The WiMAX infrastructure consists of a base station tower that transmits the signal to subscriber stations connected to the users' computers. The base station can also transmit a highly directional, high-data-rate signal to another tower over a backhaul link for long-distance communication; for this, both towers need to be in line of sight. This is useful for providing broadband access to remote areas where installing cable is not cost-effective. [8][10]

1.6 Structure of the work

The remaining chapters are structured as follows.

Chapter 2 covers the background of the essential tools and concepts for designing a real-time video communication system and gives an overview of the features of WiMAX technology.

Chapter 3 describes the background of the project, the companies involved, the proposed plan and the methodology used.

Chapter 4 presents the general algorithm for designing a video conference application and the steps involved in the design of the software used for the analysis.

Chapter 5 analyses the real-time video transfer operation through tests conducted using the video conferencing application and various other software and hardware tools, and assesses the feasibility of this application in a WiMAX network.

Chapter 6 concludes the work and gives suggestions for future work.


2 Background of Real Time Communication

2.1 Sockets, Ports and streaming

2.1.1 Sockets

In order to initiate communication between computer devices, a communication channel or path must be established between the communicating nodes. This channel is provided by sockets. In the application layer a socket is identified by the port number assigned to the process, and the transmission of data to a particular computer node is done with the help of its IP address.[2]

A socket uses the IP address to identify the receiver node. Transmitting data to more than one receiver normally requires a server architecture in the application layer that manages and controls all the communication. An alternative is to use a multicast IP address (a class D address), which can be used for multicast group communication: all the members of the group share the same multicast IP address. This can be used for real-time broadcasting of video.[2]
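As a minimal sketch in the C# used elsewhere in this report, the following shows how a UDP socket can be created and used to send a datagram to a receiver identified by IP address and port; the address and port values here are placeholders, not taken from the report.

using System.Net;
using System.Net.Sockets;

class SocketSketch
{
    static void Main()
    {
        // An IPv4 datagram (UDP) socket, as used in the appendix code.
        Socket sender = new Socket(AddressFamily.InterNetwork,
                                   SocketType.Dgram, ProtocolType.Udp);

        // The receiver node is identified by its IP address; the port
        // number identifies the receiving process (placeholder values).
        IPEndPoint receiver = new IPEndPoint(IPAddress.Parse("192.168.2.2"), 8000);

        byte[] payload = { 1, 2, 3 };
        sender.SendTo(payload, receiver);   // one datagram to one node
        sender.Close();
    }
}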

2.1.2 Ports

Ports are 16-bit numbers governed by the computer operating system and are used to represent the different processes running on the system. In networked communication, the connection between processes running on different computers is established with the help of these port numbers, which are unique to a particular process.[2]

2.1.3 Streaming

Streaming is the direct delivery of media data from source to destination in real time, without intermediate storage. The source may be a capture device or a media server providing media content on demand over the Internet or any other network connection. To meet the bandwidth constraints of the communication network, the streamed content is compressed before delivery. The data streams delivered to the end node are played by a suitable player. This differs from downloading, which saves the data to local storage before playback. Streaming is applied in video conferencing applications, in the video-on-demand services of some websites, and in e-learning materials that provide video or audio content alongside the study materials.[3]

2.2 Encoding and Packetization

2.2.1 Encoding

Encoding is the process of converting the video signal obtained from the capture device (or any other source) into a streaming format. The first step is to convert the media source into an intermediate format that can be processed by the computer. The size of the data is then reduced by compression and scaling in order to meet the bandwidth constraints of the transmission medium. Data compression reduces the size of the data by removing redundant information that contributes little to the human visual perception of the media content. There are two kinds of compression: intraframe and interframe. Intraframe compression is a lossy scheme that compresses each frame individually. Interframe compression detects the differences in motion between frames and sends only the changing information, thereby reducing the data size. Another way to reduce the data size is to scale the media content: spatial scaling reduces the video frame size, temporal scaling drops frames that matter less to human perception, and a third option is to reduce the colour resolution. All of these methods noticeably reduce the quality of the received media, but applications such as video conferencing involve little motion, so dropping frames does not make much difference to the visual perception.[3]
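As an illustrative sketch only (not the report's implementation), spatial scaling of a captured frame can be performed in C# with System.Drawing by redrawing the bitmap at a smaller size; halving both dimensions is an arbitrary choice made here:

using System.Drawing;

class ScalingSketch
{
    // Spatial scaling: resample the frame to half its width and height.
    static Bitmap HalveFrame(Bitmap frame)
    {
        // The Bitmap(Image, Size) constructor resamples the source image,
        // trading resolution for a smaller encoded size.
        return new Bitmap(frame, new Size(frame.Width / 2, frame.Height / 2));
    }
}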

Two methods of coding can be used, depending on the requirements of the transmission channel: fixed-rate (source) coding and scalable (progressive) coding. In fixed-rate coding the whole video is compressed according to the target bandwidth and data size requirement. In scalable coding the video data is divided according to levels of importance, and after coding the parts are arranged in order of importance. The MPEG-4 coding standard uses scalable coding: the video is divided into a base layer, containing the data necessary for a minimum acceptable perception of the visual content, and an enhancement layer, containing the extra information needed to improve the quality. The base layer has the smallest bandwidth requirement and is given first priority during transmission; depending on the available channel bandwidth, the enhancement-layer information is sent along with the base layer to improve the quality of the video.[4][5]

2.2.2 Packetization

Packetisation is the process of splitting the compressed data into IP packets suitable for transmission over IP networks. The IP packets are then encapsulated with suitable protocol headers that govern the communication. The transmitted packets may reach the destination node out of order, so a suitable transport protocol should be used to deliver them in the right order. For real-time video communication, which does not require a reliable data service, the User Datagram Protocol (UDP) can be used; it sends each packet as a datagram with no guarantee of reliable delivery. If a reliable service is required, the Transmission Control Protocol (TCP) can be used, which acknowledges every packet received.[3]
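A minimal sketch of this idea, assuming a fixed 1000-byte payload and a 2-byte sequence number prepended to each chunk (both choices are made here for illustration and are not taken from the report):

using System;
using System.Net.Sockets;

class PacketizerSketch
{
    const int PayloadSize = 1000;   // assumed chunk size in bytes

    static void SendPacketized(byte[] data, string host, int port)
    {
        using (UdpClient udp = new UdpClient())
        {
            ushort seq = 0;
            for (int offset = 0; offset < data.Length; offset += PayloadSize)
            {
                int len = Math.Min(PayloadSize, data.Length - offset);
                byte[] packet = new byte[2 + len];
                packet[0] = (byte)(seq >> 8);    // sequence number lets the
                packet[1] = (byte)seq;           // receiver restore the order
                Buffer.BlockCopy(data, offset, packet, 2, len);
                udp.Send(packet, packet.Length, host, port);   // one UDP datagram
                seq++;
            }
        }
    }
}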

2.2.3 Error Correction

As the video data packets are compressed, any packet loss or error has a major impact on the reconstruction of the video at the receiver. Hence error-handling mechanisms can be included in the application layer. The most commonly used techniques are Forward Error Correction (FEC) and Automatic Repeat reQuest (ARQ). In ARQ, an error-check sequence is sent along with the video data packets; at the receiver the packets are checked against this sequence, an acknowledgement is sent back to the transmitter, and if an error is detected a request to retransmit the particular packet is sent. The acknowledgements introduce additional overhead in transmission time, so ARQ is less suitable for real-time communication, which is highly delay sensitive. In FEC, the video data packets are sent along with parity bits used for error correction at the receiver; the delay of acknowledgements is avoided, but there is still no full guarantee that error-free packets will be delivered. A combination of the two, called hybrid ARQ, can also be used: the video data is divided into groups of packets and each group is FEC-encoded with parity bits. The data packets of a group are sent first, and then parity packets are sent until an acknowledgement arrives from the receiver or the transmission deadline is reached; the process is then repeated with the next group. This method addresses both the time efficiency and the reliability of packet delivery.[4][5][6]
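As a toy illustration of the FEC idea, the sketch below computes a single XOR parity packet over a group of equal-length packets, which can repair exactly one lost packet in the group; this simple scheme is chosen for clarity and is not the coding used in the report:

class XorFecSketch
{
    // One parity packet over a group of equal-length data packets.
    static byte[] Parity(byte[][] group)
    {
        byte[] parity = new byte[group[0].Length];
        foreach (byte[] packet in group)
            for (int i = 0; i < parity.Length; i++)
                parity[i] ^= packet[i];
        return parity;
    }

    // If exactly one packet of the group was lost, XORing the parity
    // with all surviving packets reconstructs the missing one.
    static byte[] Recover(byte[][] survivors, byte[] parity)
    {
        byte[] missing = (byte[])parity.Clone();
        foreach (byte[] packet in survivors)
            for (int i = 0; i < missing.Length; i++)
                missing[i] ^= packet[i];
        return missing;
    }
}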

2.3 Protocols for Real Time Communication

As real-time communication needs delay-free delivery of data, protocols specific to real-time applications should be used for delivering the data. Some of these protocols are described below.

2.3.1 Real time Transport Protocol

The Real-time Transport Protocol (RTP) is a data delivery protocol used by applications that require real-time transmission of data. These applications usually use RTP on top of UDP. Through multicasting, RTP can deliver data to multiple recipients. RTP does not guarantee reliable delivery of the data packets, but it provides the timing information the receiver needs to reconstruct the stream, for example to play video content in real time in conferencing applications. It also identifies the source of the data packets and all the sources contributing to the real-time transfer within a single session.

The RTP Control Protocol (RTCP) is used alongside RTP. RTCP carries control and management information for the RTP session, such as packet loss, stream bit rate and participant information, and is used to monitor and control the quality of the session. [3][7]
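For reference, the fixed RTP header defined in RFC 3550 [7] is 12 bytes long; the sketch below shows how it could be packed in C#. The field values passed in are illustrative:

class RtpHeaderSketch
{
    // Pack the 12-byte fixed RTP header (RFC 3550, section 5.1).
    static byte[] Pack(ushort seq, uint timestamp, uint ssrc, byte payloadType)
    {
        byte[] h = new byte[12];
        h[0] = 2 << 6;                      // version 2; no padding, extension or CSRCs
        h[1] = (byte)(payloadType & 0x7F);  // payload type; marker bit left clear
        h[2] = (byte)(seq >> 8);        h[3] = (byte)seq;
        h[4] = (byte)(timestamp >> 24); h[5] = (byte)(timestamp >> 16);
        h[6] = (byte)(timestamp >> 8);  h[7] = (byte)timestamp;
        h[8] = (byte)(ssrc >> 24);      h[9] = (byte)(ssrc >> 16);
        h[10] = (byte)(ssrc >> 8);      h[11] = (byte)ssrc;
        return h;   // the payload follows the header in the same datagram
    }
}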

2.3.2 Resource Reservation Protocol

The Resource Reservation Protocol is used by applications to request that routers reserve bandwidth for special purposes. For real-time applications, reserving network resources is important to ensure the timely delivery of data packets to the destination. [3]

2.4 Overview of WiMAX technology

High data rates and improved capacity are vital for the real-time transmission of large data such as video. WiMAX is a wireless transmission technology that can provide such data rates.

WiMAX stands for Worldwide Interoperability for Microwave Access. It is based on the IEEE 802.16 standard, provides high-speed broadband service to end users and aims to eliminate cables. It covers distances of about 40 to 50 km and supports data rates of about 70 Mbps. [8][10]


2.4.1 Layers of IEEE 802.16

IEEE 802.16 defines standards for the physical and data link layers, the two lowest layers of the OSI model. The physical layer is subdivided into a transmission layer and a physical layer proper: the transmission layer governs functions such as encoding/decoding, synchronisation and bit transmission/reception, while the physical layer holds the frequency band and transmission medium specifications. Above the physical layer is the Medium Access Control (MAC) layer, which is responsible for providing access to the medium and includes services such as framing the data with address and error detection fields. Above the MAC layer is the convergence layer, which formats the data from the upper layers to fit the lower layers; it includes services such as encapsulation of protocol data units (PDUs) and address mapping. In addition, the 802.16 standard must support services such as digital audio/video multicast, digital telephony, ATM, Internet Protocol, bridged LAN, backhaul and frame relay.[8]

Figure 1.2 IEEE 802.16 protocol architecture (taken from [10])

2.4.2 Use of MIMO technology and OFDM

Multiple Input Multiple Output (MIMO) uses multiple antennas at the transmitter and the receiver. It can be designed to enhance either diversity gain or capacity gain [11]. There are two types of MIMO: single-user MIMO, in which several symbols are transmitted by a single user, and multi-user MIMO, in which the radio resource is shared by a number of users [12]. First the data is modulated using a powerful technique such as QAM or QPSK to produce several separate data streams, each mapped to a separate antenna element. After frequency up-conversion the streams are transmitted over the wireless interface; at the receiver they are down-converted and received by multiple antennas to recover the original information. As several data transmissions take place at the same time over multiple antennas, there is a significant improvement in overall throughput [13].

Orthogonal Frequency Division Multiplexing (OFDM) is a popular modulation technique used in WiMAX. OFDM divides a single frequency-selective carrier into a number of subcarriers for wireless transmission. The subcarriers are kept orthogonal by maintaining a minimum frequency separation between them, which allows their spectra to overlap and so improves the bandwidth efficiency of the system. Problems such as intersymbol interference are largely avoided in OFDM-based transmission, and because the technique uses narrowband subcarriers, frequency-selective fading is also largely avoided. [11]


3 Project Background and Proposed Plan

3.1 Project Background

This work was done as one of the pilot projects for the WiMAX Forest project, a major deployment of a WiMAX network at the Djanogly school and Hadden Park High School in Nottingham, enabling these schools to connect to various companies and institutions for educational purposes such as e-learning and the provision of learning materials to students on the network. Companies such as Active Ingredient and Hot Knife Digital Media, and organisations such as Nottingham Trent University, are involved and are among the potential users of this project.

The initial phase of the project was to implement some pilot projects to test the sustainability of the WiMAX network. Synetrix, a company that is also a partner in the project, took responsibility for deploying the network. Initially they tried to deploy the network at the school, connecting it with a cultural centre about a mile away, and prior to deployment a radio survey of the area was planned by the company. The process was delayed, however, and hence the deployment of the WiMAX network was also delayed.

Providing e-learning access to students through this network may involve live interaction between faculty and students through video conferencing, and live demonstrations through video broadcast, which are effective ways of learning. Hence the design and analysis of such a real-time communication system, involving the transfer of large amounts of video data to the subscribers on the network in real time, would be very useful to this project.

3.2 Proposed Plan

The main idea of this work is to design a video conference system that will work in local and wide area networks, and to analyse real-time video transfer with the help of this application and various other software and hardware tools.

Initially the application was planned to be tested in a WiMAX network to analyse its capability, but due to the unexpected delay in the deployment of the network, the video conference application and the other video analyses were tested in a Wi-Fi network, which has broadly similar characteristics to a WiMAX network, and the results obtained were used to predict the capability of a WiMAX network through theoretical analysis.

3.3 Methods followed

The basics of designing a real-time video communication system were studied through a literature survey.

To understand socket communication, a chat program using a multicast IP address was studied, based on examples from a book source.[2]

With the help of the case studies on real-time communication, a general algorithm for designing a video conference system was produced, with a clear description of all the requirements.


Then, using the Java Media Framework (JMF) API, which is available on the Java platform and has special features for handling media content and devices, a software program to detect the capture device, capture video and send it to another device using the Real-time Transport Protocol was attempted with the help of online examples provided on the Sun Microsystems website [14]. However, only the capture device detection and video capture were achieved, as the processor needed a proper codec to support the RTP format used for transmission.

The C# language platform, which can use the features and components of the Windows operating system, was then used to design the conference system with the help of an online open-source example [1]. Even though this platform is constrained to the Windows operating system, the design is simple and it was found very useful for the analysis.

To test the ability of the network to handle the large real-time data transfers that actually take place during a conference, the Windows Meeting Space software was used. This software, available in the Windows Vista operating system, can share the desktop, or applications running on the desktop, with the other computer nodes in the same network. A webcam application was run on one computer and shared with the other computer nodes in the network, and the real-time transfer of the media content was tested while varying various factors.

The Wireshark software was used to capture the data packets, and the statistics of the packet transfer obtained were used for the analysis.

All the tests were done on two types of Wi-Fi network: a point-to-point ad hoc connection between two computers, and a shared network using an IEEE 802.11n router/access point with MIMO technology. A Wi-Fi network was used as it is somewhat similar to WiMAX technology. Tests were also carried out on a wired network connection between two computers, which provides a data rate of about 100 Mbps, close to the data rate envisaged for the WiMAX network. Research was done to analyse the layers of WiMAX, and with the help of the practical data collected in the experiments and its analysis, the feasibility of this real-time application in a WiMAX network was discussed.


4 Algorithms and Software Description

4.1 Algorithm

Using the tools essential for the design of a real-time video conference system, identified with the aid of the case studies in chapter 2, a general algorithm for designing such a system is proposed in this section. This general procedure can be used as the core idea for developing such applications on any platform.

4.1.1 Transmission Node

4.1.1.1 Application Layer

The first step in video conferencing is to capture live video using a capture device. The capture device should be a video camera whose output media format is supported by the computer's operating system. The media content should then be converted into an intermediate format that can be processed by the computer.

In the encoding step, the video data in the intermediate format should be compressed and scaled, using suitable standards, to reduce the data size without a major reduction in quality. This step is needed to meet the data rates supported by the transmission medium.

The encoded content should be broken into packets of a size suitable for transmission in networked environments. These packets should be encapsulated with transmission protocol headers specific to real-time communication, to distinguish them from other data packets, and each packet should include a sequence number so that it can be reordered properly on reception.

Figure 1.4 Transmission node of the application layer: capture device → intermediate format → encoding → packetization

4.1.1.2 Transmission Layer

In order to establish the connection between the nodes participating in the conference, the communicating socket should be initialised with a port number for process identification, the type of transmission protocol governing the communication, and the IP address of the receiver node to identify it among the other nodes in the network.

The final step is the transmission of the data packets over a suitable physical medium. Before transmission the data packets should be modulated using an appropriate technique that best fits the physical mode of transmission.



Figure 2.4 Transmission layer: socket initialisation (port number, transmission protocol, IP address) → modulation and transmission → physical medium

4.1.2 Reception Mode

The first step is to receive the data packets intended for the node. The communicating socket for connection establishment, consisting of the IP address of the receiver node and the port number for process identification, should be properly identified.

After this initial identification stage, the receiver node should receive the data and demodulate it. The informative data is then reconstructed by ordering the packets based on their sequence numbers.

Figure 3.4 Reception layer: physical medium → reception and demodulation → socket identification (IP address, transmission protocol, port number)

The data should then be decoded to recover the actual video content. This content is processed by the computer, a suitable player for the content is identified, and the video content is sent to it to be played live.

Figure 4.4 Reception node of the application layer: reconstruction → decoding → processor → player



4.1.3 Communication

Figure 5.4 A video conference system: each node runs both a transmission mode (Tx) and a reception mode (Rx), with a server coordinating the session

To set up real-time video conferencing between computer nodes, each node should implement both the transmission and reception mode operations so as to have full-duplex communication. For a conference between more than two computers, a server application might be needed to manage the IP addresses and port numbers of the destination nodes and to exercise central control over the conference session.

4.2 Software Description

A video conferencing system was designed with the help of an open-source example [1]. The application was implemented on the C# platform, which can use the features and components of computers running the Windows operating system.

Figure 6.4 Video Conferencing application



4.2.1 Application Description

This application captures the image in front of the capture device using the webcam component attached to the computer and transfers the captured image in real time to the same application running on the intended computer in the same network, using sockets that specify the receiving node's IP address and the port number specific to the process. By continuously writing the images to a picture box on the receiving computer, the motion of the video is perceived. The basic steps involved in the design of this application are given below.

4.2.2 Initialization

The socket was initialised with the network address family set to IPv4 and the socket type defined as datagram, with UDP assigned as the transmission protocol. The timer for capturing images was set to 100 milliseconds. The WebCam_Capture class uses the capturing features of the webcam device attached to the computer.
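A condensed sketch of this initialisation, mirroring the appendix code (the 100 ms timer interval is the value stated above; the surrounding class is illustrative):

using System.Net.Sockets;
using System.Windows.Forms;

class InitSketch
{
    Socket r;
    Timer capturing = new Timer();

    void Init()
    {
        // IPv4 network, datagram socket type, UDP transport,
        // exactly as in the appendix code.
        r = new Socket(AddressFamily.InterNetwork,
                       SocketType.Dgram, ProtocolType.Udp);

        // Capture one image every 100 milliseconds.
        capturing.Interval = 100;
    }
}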

4.2.3 Transmission node

1) Get the image from the webcam capture component and store the image data as a stream.

2) Display the captured image by writing the stream to the picture box.

3) Store the image stream in a buffer.

4) Establish a socket connection to the remote client using its IP address and the port number assigned to the process.

5) Create a new binary writer and write the image stream to it at the given address.

6) Flush the contents after writing, to allow the next stream to be written.

7) Repeat the steps from the beginning until the end of the conference.

8) Close the connection at the end of the session.

A condensed sketch of this send loop in C# is given below.
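The sketch assumes each frame is JPEG-encoded and sent as one UDP datagram; the appendix writes through a BinaryWriter, whereas Socket.Send is used here for brevity, and the helper names are illustrative:

using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Windows.Forms;

class SendLoopSketch
{
    Socket sock = new Socket(AddressFamily.InterNetwork,
                             SocketType.Dgram, ProtocolType.Udp);

    // Called on each timer tick with the frame captured by the webcam.
    void SendFrame(Image frame, PictureBox localView, string ip, int port)
    {
        localView.Image = frame;                  // step 2: local display

        using (MemoryStream ms = new MemoryStream())
        {
            frame.Save(ms, ImageFormat.Jpeg);     // steps 1 and 3: buffer the frame
            byte[] buffer = ms.ToArray();

            if (!sock.Connected)
                sock.Connect(IPAddress.Parse(ip), port);   // step 4
            sock.Send(buffer);                    // steps 5 and 6: write one frame
        }                                         // steps 7 and 8: the timer repeats
    }                                             // this; the socket closes at the end
}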

4.2.4 Reception node

1) Open the port assigned to the communication process and start listening on that port for incoming data.

2) When a connection request arrives, accept it and start receiving the stream of data from that port.

3) Display the image stream by writing it to the picture box.

4) Once the image has been written to the picture box, flush the stream and close the session.

5) Continue from the first step and receive the next stream until the connection is terminated.

A condensed sketch of this receive loop in C# is given below.
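The sketch makes the same assumptions as the send loop (UDP transport, one JPEG frame per datagram); names are illustrative, and in a real form the picture-box assignment would need to be marshalled to the UI thread:

using System.Drawing;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Windows.Forms;

class ReceiveLoopSketch
{
    // Listens on the given port and paints each received frame.
    static void ReceiveFrames(int port, PictureBox remoteView)
    {
        using (UdpClient udp = new UdpClient(port))        // step 1: listen
        {
            IPEndPoint any = new IPEndPoint(IPAddress.Any, 0);
            while (true)                                   // step 5: next stream
            {
                byte[] data = udp.Receive(ref any);        // step 2: one datagram
                using (MemoryStream ms = new MemoryStream(data))
                using (Image img = Image.FromStream(ms))
                    remoteView.Image = new Bitmap(img);    // steps 3 and 4: display,
            }                                              // then release the stream
        }
    }
}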

4.2.5 Video Conference between two nodes

To set up a conference between two communicating devices, both the transmission and reception nodes should be included in the application, which runs on both devices. Separate ports should be assigned to the transmission and the reception of video on a node: the transmitting port on one node should be the same as the receiving port on the opposite side, and vice versa.

4.2.6 Conference between multiple nodes

To set up communication between multiple nodes, the number of receiver nodes should be increased and a separate port assigned to each node. The sending node should send a copy of the stream to all the receiving nodes taking part in the communication, indicated by their IP addresses, as sketched below.
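A sketch of the sender-side change for multiple receivers, assuming the endpoints are kept in a list (the names here are illustrative):

using System.Collections.Generic;
using System.Net;
using System.Net.Sockets;

class MultiSendSketch
{
    Socket sock = new Socket(AddressFamily.InterNetwork,
                             SocketType.Dgram, ProtocolType.Udp);
    List<IPEndPoint> receivers = new List<IPEndPoint>();

    // Send a copy of the encoded frame to every receiving node.
    void Broadcast(byte[] frame)
    {
        foreach (IPEndPoint node in receivers)
            sock.SendTo(frame, node);
    }
}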


5 Experiments and Result Analysis

Different video transmission scenarios were tested between two computers using the Windows Meeting Space software available in the Windows Vista operating system, and using the video conferencing software modified from an open source [1] as explained in the previous chapter.

The applications were tested in three network setups: a wireless network using a router with the IEEE 802.11n Wi-Fi standard (connected to the Internet and therefore subject to congestion from download operations and shared use by many users), a wireless ad hoc network set up between two computers (free of congestion), and a wired Ethernet connection between two computers (free from interference on the air interface).

5.1 Windows Meeting Space

5.1.1 Experimental Setup

Two computers running the Windows Vista operating system were used. The different types of network connection were set up between the computers, and a webcam application on one computer was shared with the other using the Windows Meeting Space software, to test the network's capability to handle bulk video transmission in real time. The video data size was varied by adjusting the webcam resolution, and the delay in the reception of the image on the other computer was noted. Using the Wireshark software, the average data rates of the three networks were noted for two of the resolution sizes.

Resolution     IEEE 802.11n LAN          Ad hoc                    Ethernet
               Delay (s) / Rate (Mbps)   Delay (s) / Rate (Mbps)   Delay (s) / Rate (Mbps)
160 x 120      33 / 0.55                 Below 1 / 1.5             Real time / 1.42
800 x 600      49 / -                    7 / -                     1 / -
1600 x 1200    66 / 0.45                 10 / 9.8                  2 to 3 / 15.5

Table 1.5 Test results of real-time video transfer using Windows Meeting Space (a dash marks a data rate that was not recorded)

5.1.2 Real-time Video Transfer Analysis

5.1.2.1 IEEE 802.11n Wireless Local area Network

As the network was congested with traffic created by download operations carried out by other nodes in the network, an immense delay in the reception of the video motion was observed, and a very low data rate was seen due to the congestion and the limitations of the network. The delay increased considerably with resolution; many video frames were dropped and only very few reached the other node, so no video motion was perceived at the receiver.


5.1.2.2 Ad hoc Wireless Network

The point-to-point ad hoc connection between the two computers removed the network congestion, but a small delay was still observed due to interference on the air interface used as the transmission medium. At low resolution the delay was less than a second, and it grew moderately as the resolution increased. Video motion was observed at the other node at low resolution with a few dropped frames, but a considerable number of frames were dropped as the resolution increased. The observed data rate was reasonable, and it increased with the data size as the resolution increased.

5.1.2.3 Ethernet Wired Network

The wired Ethernet avoided both network congestion and air-interface interference. Clear motion was perceived at the other end without delay, and only a small delay of about 1 to 2 seconds was noticed as the resolution increased. The data rate was also good and improved with increasing data size.

5.2 Video Conferencing Software

5.2.1 Experimental Setup

The video conference application was run on two computers; video was sent from one node acting as transmitter and received at the other end. The delay in receiving the video was noted on the different networks for two frame sizes. To keep the data size down, small frame sizes were chosen. The data traffic in the network was also analysed by capturing packets with the Wireshark software.

Frame size   IEEE 802.11n LAN   Ad hoc      Ethernet
             Delay (s)          Delay (s)   Delay (s)
320 x 240    2 to 3             Real time   Real time
176 x 144    Real time          Real time   Real time

                              IEEE 802.11n LAN   Ad hoc   Ethernet
Average packets per second    57                 104      105
Average packet size (bytes)   769                801      912
Average data rate (Mbps)      0.355              0.67     0.766

Table 2.5 Results of the video conferencing application test


5.2.2 Analysis of Video conferencing application test

5.2.2.1 IEEE 802.11n Wireless Local area Network

A small delay was noted at a frame size of 320 x 240. The established connection between the two conferencing nodes failed when the network was congested by a download operation from the Internet, but at a frame size of 176 x 144 under the same conditions no disconnection was observed, even though a few frames were dropped due to congestion.

5.2.2.2 Ad hoc Wireless Network

Even though the video was received almost in real time, a few frames were occasionally dropped because of the air medium, so the motion stopped for short periods (3 to 5 seconds).

5.2.2.3 Ethernet Wired Network

Real-time reception of the video without dropped frames was observed at the receiver node, owing to the absence of air-interface interference.

5.2.2.4 Packet Analysis

The packets captured in all three networks show that the packet flow in the IEEE 802.11n network, which uses the router as a central access point, was lower than in the other networks, which have a direct point-to-point connection between the communicating nodes. This may be partly because all packets have to be directed through the router to reach their destination node, which slows the packet flow in the network.

The dropping of data packets under congestion, and errors in incoming packets, reduce the quality of the perceived video; hence suitable error control techniques that do not disturb the real-time transmission too much should be applied according to the network conditions. To ensure timely playback of the video at the receiving end, the Real-time Transport Protocol could be implemented in this application.

5.2.3 Quality Test with Different Formats

The video motion in the conferencing system is created by continuously writing picture frames, each defined in JPEG format, to the picture box. The variation in delay and quality of the video in the congested network was observed by changing the picture format, as given in the table.

Picture format   Delay / Quality of the video
GIF              Low delay (4 seconds) / low quality
PNG              Medium delay (43 seconds) / moderate quality
TIFF             High delay (70 seconds) / good quality

Table 3.5 Results of the quality test


5.3 Video Size test

Using a webcam, videos of 25 seconds each were recorded at different frame sizes, with and without compression (MJPEG), and the size of each video was noted as given in the table. The AVI file format was used in this test.

Compression method   Frame dimensions   Memory size (KB)   Data rate (KB/s)
MJPEG                640 x 480          22764              910
MJPEG                320 x 240          4370               175
MJPEG                176 x 144          1885               75
No compression       640 x 480          136040             5441
No compression       320 x 240          34115              1364
No compression       176 x 144          11314              452

Table 4.5 Results of the video size test

From the table it is evident that compression reduces the size of the video by roughly six to nine times, although the quality of the video is also reduced. A reduction in size of three to five times was observed when the frame dimensions were scaled down by a factor of two to three.

Thus a suitable compression and scaling technique should be chosen according to the available channel bandwidth before sending real-time data packets, for efficient transmission. The scalable video coding discussed in chapter 2 appears to be the best option for this type of application.

From all the above experiments it is evident that the timely delivery of video data is affected by congestion in the network and by limitations in the data rate the network provides. The quality of the video has to be compromised according to the characteristics of the network used for the real-time communication. The results obtained on the wired network were better than those obtained on the wireless networks. The next section discusses the feasibility of these applications in a WiMAX network, a wireless network whose features are closer to those of a wired network.

5.4 Feasibility of the Application in a WiMAX Network

5.4.1 Application layer

The experimental results given earlier in this chapter show that congestion created in a Wi-Fi network by other processes accessing the network resources reduces the performance of real-time applications, since this network has no priority mechanism for real-time data transfer.

According to the IEEE 802.16 standard, WiMAX provides quality of service specific to real-time applications. With the Real-Time Polling Service (rtPS), a subscriber station can obtain access to the medium for a specific length of time when it wants to send data packets in real time [9]. In addition, WiMAX provides a dedicated service for audio/video broadcast from the base station to the subscriber stations [8]. Transport protocols such as the Real-time Transport Protocol and the Resource Reservation Protocol can be implemented in the application to distinguish real-time data packets from normal data transfers and to govern the real-time communication efficiently.

5.4.2 Medium Access Control layer

For time-critical packet transfers, the WiMAX standard supports a Medium Access Control payload rate of 32 kbps to 1.5 Mbps with a maximum delay of 10 milliseconds [8]. So, unlike the Wi-Fi standard, which provides no special QoS mechanism for medium access for time-critical packets, the time-critical video packets of a video conference application will be given preference in accessing the medium in a WiMAX network, and hence better performance of this application can be expected over WiMAX.

5.4.3 Physical Layer

The experiments showed that interference on the air interface in a Wi-Fi network degraded the performance of the video transmission; hence the physical transmission medium plays a major role in performance. The Wi-Fi standard uses the 2.4 GHz and 5 GHz frequency ranges, which are licence-exempt bands [8], so there is a high chance of interference from other devices operating at the same frequencies, such as microwave ovens. Most Wi-Fi standards provide a physical data rate of 54 Mbps, which matters for the real-time transfer of large data such as video; the network used in the experiments, based on the IEEE 802.11n standard with MIMO antenna technology, may provide data rates of up to 100 Mbps [8]. The coverage area of a Wi-Fi network is limited to a few hundred metres.

WiMAX uses MIMO antenna techniques for transmission and OFDM for modulation. Together these techniques improve capacity and throughput and reduce intersymbol interference, so more users can take part in a video conference at the same time. According to the IEEE 802.16 standard, a WiMAX network can operate in the frequency ranges 2 to 11 GHz and 10 to 66 GHz; this wide range allows a transmission frequency to be chosen that is free from interference. WiMAX can provide a data rate of 70 Mbps, and the IEEE 802.16 standard specifies supported data rates in the range 32 to 134 Mbps, which is well suited to real-time video communication. [10]


6 Conclusion and Future work

6.1 Conclusion

The various constraints on real-time video communication, such as delayed reception of video, data traffic congestion, and reduction in quality due to compression and scaling, were identified and tested practically using the video conferencing software tool. The general design algorithm proposed should be very useful for designing such real-time conference systems with added features.

The video conference system used here was designed on the C# platform and runs on devices with the Windows operating system. Video motion is created by continuously writing images captured from the webcam to the picture box of the application running on the other computer node taking part in the communication.

The best performance was observed when using this application on wired networks, which are free from air-interface interference and can provide a high data rate. Since WiMAX technology can provide similar data rates and good signal quality, similar performance can be expected when using this application over WiMAX.

The transfer of video data in a networked environment requires a very high data rate and substantial bandwidth. A WiMAX network can provide a wireless connection with a data rate of 70 Mbps or even more, depending on the implementation method and environmental conditions. According to the IEEE 802.16 standard, WiMAX can use the frequency ranges 2 to 11 GHz and 10 to 66 GHz, giving flexibility in the choice of a wide range of bandwidth for efficient communication. Hence this technology is well suited to these kinds of real-time applications.

The analysis, the methods used, the research work and the results obtained here should be useful in the future design of real-time communication systems that transfer large data such as video over local or wide area networks. The theoretical analysis of the capability of a WiMAX network to handle such real-time transmissions will be helpful when these applications are used on such networks after their deployment.

As this work was part of a multidisciplinary project involving various companies, institutions and their potential customers, valuable experience of working on a real problem was gained. Working alongside people from different backgrounds developed good teamwork skills and the ability to work in a real environment.

6.2 Future Work

The capability of WiMAX technology to handle these kinds of real-time data transfers can be tested and analysed practically in a WiMAX network, which will be useful for the future commercial deployment of these networks.

The video conferencing system used here for the analysis used one-way (half-duplex) communication of real-time video from one computer node to another; the analysis can be extended to test two-way (full-duplex) communication. A video conference system communicating between multiple computer nodes can be designed by increasing the number of reception nodes in the application, and a similar analysis can be done on this system to test the capacity of the network for real-time video transfer.

The video conferencing solution used here can only be used on computers running the Windows operating system, as it is designed on the C# platform. A similar conferencing system could be developed on the Java platform using the Java Media Framework (JMF) [14], with the Real-time Transport Protocol implemented, so that it can be tested and used on other operating systems and potentially on future mobile phones with WiMAX network access.


References

[1] Fadi Abdelqader, April 2008, "P2P Voice/Video Conferencing example", [online], Available at: <http://socketcoder.com/ArticleFile.aspx?index=2&ArticleID=44> [Accessed 7 August 2009].

[2] Marko Boger, 2001, Java in Distributed Systems, John Wiley and Sons Ltd (translated edition), pages 50-51, 64-68, 355-356.

[3] David Austerberry, 2005, The Technology of Video and Audio Streaming, Focal Press, second edition, pages 7-9, 133-148, 154-177.

[4] A. Majumdar, D. Sachs, I. Kozintsev, K. Ramchandran and M. Yeung, June 2002, "Multicast and unicast real-time video streaming over wireless LANs", IEEE Trans. Circuits Syst. Video Technol., vol. 12, pp. 524-534, [online], Available via: Google Scholar [Accessed 1 August 2009].

[5] S. Krishnamachari, M. van der Schaar, S. Choi and X. Xu, April 2003, "Video streaming over wireless LANs: A cross-layer approach", in Proceedings of the Packet Video Workshop, Nantes, France, [online], Available via: Google Scholar [Accessed 3 August 2009].

[6] Y. Shan and A. Zakhor, August 2002, "Cross layer techniques for adaptive video streaming over wireless networks", Proc. IEEE Int. Conf. Multimedia and Expo (ICME), pp. 277-280, [online], Available via: Google Scholar [Accessed 3 August 2009].

[7] H. Schulzrinne, S. Casner, R. Frederick and V. Jacobson, July 2003, "RTP: A Transport Protocol for Real-Time Applications", RFC 3550, Internet Society and IETF, [online], Available via: Google Scholar [Accessed 3 August 2009].

[8] William Stallings, 2008, Wireless Communications and Networks, Eastern Economy Edition, Prentice Hall of India (reprint), pages 342-354, 442, 452.

[9] Kitti Wongthavarawat and Aura Ganz, 2003, "Packet scheduling for QoS support in IEEE 802.16 broadband wireless access systems", Int. J. Commun. Syst., vol. 16, pp. 81-96, John Wiley and Sons Ltd, [online], Available via: Google Scholar [Accessed 3 August 2009].

[10] Sanida Omerovic, "WiMAX Overview", Faculty of Electrical Engineering, University of Ljubljana, Slovenia, [online], Available via: Google Scholar [Accessed 15 August 2009].

[11] Gordon L. Stüber et al., February 2004, "Broadband MIMO-OFDM Wireless Communications", Proceedings of the IEEE, vol. 92, no. 2, [online], Available via: Google Scholar [Accessed 15 May 2009].

[12] Ari Hottinen et al., August 2006, "Industrial Embrace of Smart Antennas and MIMO", IEEE Wireless Communications, [online], Available via: Google Scholar [Accessed 15 May 2009].

[13] Jun Wang et al., 9 May 2008, "Joint bandwidth allocation, element assignment and scheduling for wireless mesh networks with MIMO links", Computer Communications, pages 1372-1384, [online], Available via: Science Direct [Accessed 16 May 2009].

[14] "Java Media Framework API Guide", [online], Available at: <http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/guide/index.html> [Accessed 25 July 2009].


Appendix

The source code for the video conference system on the C# platform used in this work is given below. It is modified from the open-source example "P2P Voice/Video Conferencing example" [1], authored by Fadi Abdelqader.

using System; using System.Drawing; using System.Collections; using System.ComponentModel; using System.Windows.Forms; using System.Net; using System.Net.Sockets; using System.Threading; using System.IO; using System.Data; namespace Conference_System { /// <summary> /// Summary description for Form1. /// </summary> public class Form1 : System.Windows.Forms.Form { private System.Windows.Forms.Label label2; private System.Windows.Forms.TextBox text_IP; private System.Windows.Forms.PictureBox pictureBox1; private System.Windows.Forms.PictureBox pictureBox2; private System.Windows.Forms.Label label4; private System.Windows.Forms.Label label5; private System.Windows.Forms.Label label6; private System.Windows.Forms.Label label7; private System.Windows.Forms.Button button3; private System.Windows.Forms.Button button4; private WebCam_Capture.WebCamCapture webCamCapture1; private System.Windows.Forms.TextBox text_Camera_rec_port; private System.Windows.Forms.Timer Capturing; private System.Windows.Forms.TextBox text_Camera_send_port; private WebCam_Capture.WebCamCapture WebCamCapture; private System.Windows.Forms.LinkLabel linkLabel1; private System.ComponentModel.IContainer components; public Form1() { // // Required for Windows Form Designer support // InitializeComponent(); r = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp); } /// <summary> /// Clean up any resources being used.

Page 30: Video

23

/// </summary> protected override void Dispose( bool disposing ) { if( disposing ) { if (components != null) { components.Dispose(); } } base.Dispose( disposing ); } #region Windows Form Designer generated code /// <summary> /// Modified for Video analysis /// </summary> private void InitializeComponent() { this.components = new System.ComponentModel.Container(); this.text_IP = new System.Windows.Forms.TextBox(); this.label2 = new System.Windows.Forms.Label(); this.label4 = new System.Windows.Forms.Label(); this.label5 = new System.Windows.Forms.Label(); this.text_Camera_rec_port = new System.Windows.Forms.TextBox(); this.label6 = new System.Windows.Forms.Label(); this.label7 = new System.Windows.Forms.Label(); this.button3 = new System.Windows.Forms.Button(); this.button4 = new System.Windows.Forms.Button(); this.Capturing = new System.Windows.Forms.Timer(this.components); this.text_Camera_send_port = new System.Windows.Forms.TextBox(); this.WebCamCapture = new WebCam_Capture.WebCamCapture(); this.linkLabel1 = new System.Windows.Forms.LinkLabel(); this.pictureBox2 = new System.Windows.Forms.PictureBox(); this.pictureBox1 = new System.Windows.Forms.PictureBox(); ((System.ComponentModel.ISupportInitialize)(this.pictureBox2)).BeginInit(); ((System.ComponentModel.ISupportInitialize)(this.pictureBox1)).BeginInit(); this.SuspendLayout(); // // text_IP // this.text_IP.Location = new System.Drawing.Point(104, 0); this.text_IP.Name = "text_IP"; this.text_IP.Size = new System.Drawing.Size(256, 20); this.text_IP.TabIndex = 10; this.text_IP.Text = "192.168.2.2"; this.text_IP.TextChanged += new System.EventHandler(this.text_IP_TextChanged); // // label2 // this.label2.Font = new System.Drawing.Font("Microsoft Sans Serif", 9.75F, System.Drawing.FontStyle.Regular, System.Drawing.GraphicsUnit.Point, ((byte)(178)));


            this.label2.Location = new System.Drawing.Point(16, 0);
            this.label2.Name = "label2";
            this.label2.Size = new System.Drawing.Size(88, 24);
            this.label2.TabIndex = 15;
            this.label2.Text = "IP Address";
            //
            // label4
            //
            this.label4.Location = new System.Drawing.Point(24, 40);
            this.label4.Name = "label4";
            this.label4.Size = new System.Drawing.Size(128, 16);
            this.label4.TabIndex = 18;
            this.label4.Text = "My Camera";
            //
            // label5
            //
            this.label5.Location = new System.Drawing.Point(256, 40);
            this.label5.Name = "label5";
            this.label5.Size = new System.Drawing.Size(128, 16);
            this.label5.TabIndex = 19;
            this.label5.Text = "Remote Camera";
            //
            // text_Camera_rec_port
            //
            this.text_Camera_rec_port.BorderStyle = System.Windows.Forms.BorderStyle.FixedSingle;
            this.text_Camera_rec_port.Location = new System.Drawing.Point(139, 254);
            this.text_Camera_rec_port.Name = "text_Camera_rec_port";
            this.text_Camera_rec_port.Size = new System.Drawing.Size(104, 20);
            this.text_Camera_rec_port.TabIndex = 24;
            this.text_Camera_rec_port.Text = "8001";
            //
            // label6
            //
            this.label6.Location = new System.Drawing.Point(16, 256);
            this.label6.Name = "label6";
            this.label6.Size = new System.Drawing.Size(128, 16);
            this.label6.TabIndex = 23;
            this.label6.Text = "Receiving Video Port";
            //
            // label7
            //
            this.label7.Location = new System.Drawing.Point(16, 232);
            this.label7.Name = "label7";
            this.label7.Size = new System.Drawing.Size(128, 16);
            this.label7.TabIndex = 21;
            this.label7.Text = "Sending Video Port";
            //
            // button3
            //
            this.button3.FlatStyle = System.Windows.Forms.FlatStyle.Flat;
            this.button3.Location = new System.Drawing.Point(227, 116);
            this.button3.Name = "button3";
            this.button3.Size = new System.Drawing.Size(144, 24);
            this.button3.TabIndex = 26;


            this.button3.Text = "Stop Camera";
            this.button3.Click += new System.EventHandler(this.button3_Click);
            //
            // button4
            //
            this.button4.FlatStyle = System.Windows.Forms.FlatStyle.Flat;
            this.button4.Location = new System.Drawing.Point(227, 73);
            this.button4.Name = "button4";
            this.button4.Size = new System.Drawing.Size(144, 24);
            this.button4.TabIndex = 25;
            this.button4.Text = "Start Camera";
            this.button4.Click += new System.EventHandler(this.button4_Click);
            //
            // Capturing
            //
            this.Capturing.Tick += new System.EventHandler(this.Capturing_Tick);
            //
            // text_Camera_send_port
            //
            this.text_Camera_send_port.BorderStyle = System.Windows.Forms.BorderStyle.FixedSingle;
            this.text_Camera_send_port.Location = new System.Drawing.Point(139, 228);
            this.text_Camera_send_port.Name = "text_Camera_send_port";
            this.text_Camera_send_port.Size = new System.Drawing.Size(104, 20);
            this.text_Camera_send_port.TabIndex = 27;
            this.text_Camera_send_port.Text = "8000";
            //
            // WebCamCapture
            //
            this.WebCamCapture.CaptureHeight = 240;
            this.WebCamCapture.CaptureWidth = 320;
            this.WebCamCapture.FrameNumber = ((ulong)(0ul));
            this.WebCamCapture.Location = new System.Drawing.Point(17, 17);
            this.WebCamCapture.Name = "WebCamCapture";
            this.WebCamCapture.Size = new System.Drawing.Size(342, 252);
            this.WebCamCapture.TabIndex = 0;
            this.WebCamCapture.TimeToCapture_milliseconds = 100;
            this.WebCamCapture.ImageCaptured += new WebCam_Capture.WebCamCapture.WebCamEventHandler(this.WebCamCapture_ImageCaptured);
            //
            // linkLabel1
            //
            this.linkLabel1.Location = new System.Drawing.Point(160, 24);
            this.linkLabel1.Name = "linkLabel1";
            this.linkLabel1.Size = new System.Drawing.Size(104, 16);
            this.linkLabel1.TabIndex = 28;
            this.linkLabel1.LinkClicked += new System.Windows.Forms.LinkLabelLinkClickedEventHandler(this.linkLabel1_LinkClicked);
            //
            // pictureBox2
            //
            this.pictureBox2.BackColor = System.Drawing.Color.DimGray;


            this.pictureBox2.Image = global::Conference_System.Properties.Resources.video7;
            this.pictureBox2.Location = new System.Drawing.Point(505, 24);
            this.pictureBox2.Name = "pictureBox2";
            this.pictureBox2.Size = new System.Drawing.Size(305, 250);
            this.pictureBox2.TabIndex = 17;
            this.pictureBox2.TabStop = false;
            this.pictureBox2.Click += new System.EventHandler(this.pictureBox2_Click);
            //
            // pictureBox1
            //
            this.pictureBox1.BackColor = System.Drawing.Color.DimGray;
            this.pictureBox1.Image = global::Conference_System.Properties.Resources.video;
            this.pictureBox1.Location = new System.Drawing.Point(8, 56);
            this.pictureBox1.Name = "pictureBox1";
            this.pictureBox1.Size = new System.Drawing.Size(183, 141);
            this.pictureBox1.TabIndex = 16;
            this.pictureBox1.TabStop = false;
            //
            // Form1
            //
            this.AutoScaleBaseSize = new System.Drawing.Size(5, 13);
            this.ClientSize = new System.Drawing.Size(898, 376);
            this.Controls.Add(this.linkLabel1);
            this.Controls.Add(this.text_Camera_send_port);
            this.Controls.Add(this.button3);
            this.Controls.Add(this.button4);
            this.Controls.Add(this.text_Camera_rec_port);
            this.Controls.Add(this.label6);
            this.Controls.Add(this.label7);
            this.Controls.Add(this.label5);
            this.Controls.Add(this.label4);
            this.Controls.Add(this.pictureBox2);
            this.Controls.Add(this.pictureBox1);
            this.Controls.Add(this.label2);
            this.Controls.Add(this.text_IP);
            this.MaximizeBox = false;
            this.Name = "Form1";
            this.StartPosition = System.Windows.Forms.FormStartPosition.CenterScreen;
            this.Text = "Video Conference System";
            this.Load += new System.EventHandler(this.Form1_Load);
            this.Closing += new System.ComponentModel.CancelEventHandler(this.Form1_Closing);
            ((System.ComponentModel.ISupportInitialize)(this.pictureBox2)).EndInit();
            ((System.ComponentModel.ISupportInitialize)(this.pictureBox1)).EndInit();
            this.ResumeLayout(false);
            this.PerformLayout();
        }
        #endregion


        /// <summary>
        /// The code for the transmission and reception from [1].
        /// </summary>
        [STAThread]
        static void Main()
        {
            Application.Run(new Form1());
        }

        //*************************************************************//

        // The fields marked as unused below are apparently retained from the
        // voice part of the original example [1]; the video path does not use them.
        private Socket r;                 // UDP socket; created and closed but never used for transfer
        private Thread t;                 // never started (unused)
        private bool connected = false;   // unused
        private byte[] m_PlayBuffer;      // unused
        private byte[] m_RecBuffer;       // unused
        TcpClient myclient;               // outgoing connection carrying one frame
        MemoryStream ms;                  // JPEG buffer for the frame being sent
        NetworkStream myns;               // stream of the outgoing connection
        BinaryWriter mysw;                // writer over the outgoing stream
        Thread myth1;                     // background thread receiving remote frames
        TcpListener mytcpl;               // listener for incoming frames
        Socket mysocket;                  // accepted incoming connection
        NetworkStream ns;                 // stream of the incoming connection

        private void Form1_Load(object sender, System.EventArgs e)
        {
            // Start the thread session that receives the remote camera.
            myth1 = new Thread(new System.Threading.ThreadStart(Start_Receiving_Video_Conference));
            myth1.Start();
        }

        private void Start_Receiving_Video_Conference()
        {
            try
            {
                // Open the port and start listening on it.
                mytcpl = new TcpListener(int.Parse(text_Camera_rec_port.Text));
                mytcpl.Start();

                mysocket = mytcpl.AcceptSocket();          // accept a request from the client and start a session
                ns = new NetworkStream(mysocket);          // receives the binary data from the port
                pictureBox2.Image = Image.FromStream(ns);  // decode the JPEG frame and display it
                mytcpl.Stop();                             // close the TCP session

                if (mysocket.Connected == true)            // loop while connected to receive the next frame
                {


                    while (true)
                    {
                        Start_Receiving_Video_Conference(); // back to the first method for the next frame
                    }
                }
                myns.Flush();
            }
            catch (Exception) { }
        }

        private void Start_Sending_Video_Conference(string remote_IP, int port_number)
        {
            try
            {
                // Store the current local frame as a JPEG in a binary memory buffer.
                ms = new MemoryStream();
                pictureBox1.Image.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
                byte[] arrImage = ms.GetBuffer(); // note: ToArray() would avoid sending unused buffer capacity

                // Connect to the remote peer and send the frame.
                myclient = new TcpClient(remote_IP, port_number);
                myns = myclient.GetStream();
                mysw = new BinaryWriter(myns);
                mysw.Write(arrImage); // send the buffer to the above address

                ms.Flush();
                mysw.Flush();
                myns.Flush();
                ms.Close();
                mysw.Close();
                myns.Close();
                myclient.Close();
            }
            catch (Exception ex)
            {
                Capturing.Enabled = false;
                MessageBox.Show(ex.Message, "Video Conference Error Message", MessageBoxButtons.OK, MessageBoxIcon.Error);
            }
        }

        private void Form1_Closing(object sender, System.ComponentModel.CancelEventArgs e)
        {
            try
            {
                myth1.Abort();
                mytcpl.Stop();
                ns.Flush();
                ns.Close();


                if (t != null) t.Abort(); // unused thread; guard against a null reference
                r.Close();
                myclient.Close();
                ms.Close();
                myns.Close();
                mysw.Close();
            }
            catch (Exception) { }
        }

        private void button4_Click(object sender, System.EventArgs e)
        {
            // Start capturing frames from the local webcam.
            this.WebCamCapture.TimeToCapture_milliseconds = 1;
            this.WebCamCapture.Start(0);
            Capturing.Enabled = true;
        }

        private void button3_Click(object sender, System.EventArgs e)
        {
            // Stop the local capture and the periodic sending.
            this.WebCamCapture.Stop();
            Capturing.Enabled = false;
        }

        private void Capturing_Tick(object sender, System.EventArgs e)
        {
            // On every timer tick, send the most recent local frame to the remote peer.
            Start_Sending_Video_Conference(text_IP.Text, int.Parse(text_Camera_send_port.Text));
        }

        private void WebCamCapture_ImageCaptured(object source, WebCam_Capture.WebcamEventArgs e)
        {
            // Show the newly captured local frame.
            this.pictureBox1.Image = e.WebCamImage;
        }

        private void pictureBox2_Click(object sender, EventArgs e)
        {
        }

        private void linkLabel1_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
        {
        }


        // Empty designer-generated event stubs.
        private void text_IP_TextChanged(object sender, EventArgs e)
        {
        }

        private void text_SendingPort_TextChanged(object sender, EventArgs e)
        {
        }

        private void button1_Click_1(object sender, EventArgs e)
        {
        }
    }
}
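To use the application as listed, it is run on two machines: each peer enters the other machine's IP address in the IP Address field, sets its sending video port to match the remote peer's receiving video port (8000 and 8001 in the defaults above), and clicks Start Camera to begin capturing and transmitting frames.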