Current standards (such as ITU-T Recommendation G.114) recommend keeping one-way delay below 150 milliseconds, and set a maximum one-way delay of 400 milliseconds. The total delay is made up of several components.
Codec delay
The encoder/decoder, or codec, determines the algorithm used to encode and decode the source voice. The codec can introduce a delay of 10 to 50 milliseconds or more.
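For a rough sense of scale, the sketch below lists commonly cited algorithmic (frame plus look-ahead) delays for a few ITU-T codecs. These figures are approximate, exclude packetization and processing time, and are not taken from this article.

```python
# Illustrative, approximate algorithmic delays for some ITU-T voice codecs.
# Actual end-to-end codec delay also includes packetization and processing.
CODEC_DELAY_MS = {
    "G.711": 0.125,   # sample-based: one 8 kHz sample
    "G.729": 15.0,    # 10 ms frame + 5 ms look-ahead
    "G.723.1": 37.5,  # 30 ms frame + 7.5 ms look-ahead
}

for codec, delay in CODEC_DELAY_MS.items():
    print(f"{codec}: ~{delay} ms algorithmic delay")
```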
Propagation delay
Propagation delay is the time the signal needs to travel to its destination at roughly the speed of light. It is usually small, but on a satellite circuit it can reach 250 milliseconds.
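As a back-of-the-envelope check, the sketch below computes propagation delay from distance and signal speed. The ~2e8 m/s figure for fiber and copper (about two thirds of the speed of light) and the geostationary path length are assumptions, not values from the text.

```python
# Rough sketch: propagation delay = distance / signal speed.
def propagation_delay_ms(distance_km: float, speed_m_per_s: float = 2.0e8) -> float:
    return distance_km * 1000 / speed_m_per_s * 1000

# ~20 ms for a 4000 km terrestrial fiber path (assumed example)
print(propagation_delay_ms(4000))
# ~239 ms for a geostationary hop: up ~35,786 km and back down, at c
print(propagation_delay_ms(2 * 35786, speed_m_per_s=3.0e8))
```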
Serialization delay
Serialization delay (also called insertion delay) is the time required to clock the bits of a frame onto the line. An 8000-bit frame takes 800 microseconds to transmit at 10 megabits per second; the same frame takes 125 milliseconds on a 64 kilobit-per-second line.
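The arithmetic is simple enough to express directly; this minimal sketch reproduces both figures from the paragraph above.

```python
# Serialization delay = frame size / line rate.
def serialization_delay_ms(frame_bits: int, line_rate_bps: float) -> float:
    return frame_bits / line_rate_bps * 1000

print(serialization_delay_ms(8000, 10_000_000))  # 0.8 ms (800 us) at 10 Mbit/s
print(serialization_delay_ms(8000, 64_000))      # 125.0 ms at 64 kbit/s
```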
Jitter buffer
Queuing delays introduce variability in the delay, known as jitter. A jitter buffer is used at the receiving codec to smooth out this jitter before the voice is played back, and it can add 80 milliseconds or more to the delay.
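To illustrate the idea, here is a toy model of a fixed-depth playout buffer: each packet is played at its send time plus a fixed playout delay, and packets arriving after that deadline are discarded. The packet timings and the 80 ms depth are made-up illustration values.

```python
# Toy fixed jitter (playout) buffer. Times are in milliseconds.
PLAYOUT_DELAY_MS = 80  # assumed buffer depth for illustration

packets = [  # (send_time, arrival_time) with variable network delay
    (0, 40), (20, 95), (40, 70), (60, 150), (80, 120),
]

for send, arrive in packets:
    deadline = send + PLAYOUT_DELAY_MS
    status = "played" if arrive <= deadline else "late (dropped)"
    print(f"sent {send:3} ms, arrived {arrive:3} ms, "
          f"deadline {deadline:3} ms -> {status}")
```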
User perception
If the sum of all these delays is less than 150 milliseconds, users probably will not notice it. When the delay is between 150 and 400 milliseconds, users will notice it but generally still find the line usable. A line with more than 400 milliseconds of delay is considered unusable by many users.
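Putting the pieces together, the sketch below sums a one-way delay budget and classifies it against the 150 and 400 millisecond thresholds from the text. The component values are illustrative assumptions, not measurements.

```python
# Hedged sketch of a one-way delay budget (illustrative component values).
budget_ms = {
    "codec": 25,
    "propagation": 20,
    "serialization": 1,
    "jitter_buffer": 80,
}

total = sum(budget_ms.values())
if total < 150:
    verdict = "probably unnoticed"
elif total <= 400:
    verdict = "noticeable but usable"
else:
    verdict = "unusable for many users"

print(f"total one-way delay: {total} ms -> {verdict}")
```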