Benchmarking Methodology Working Group                         C. Davids
Internet-Draft                         Illinois Institute of Technology
Expires: September 15, 2011                                  V. Gurbani
                                       Bell Laboratories, Alcatel-Lucent
                                                             S. Poretsky
                                                     Allot Communications
                                                          March 14, 2011

          Methodology for Benchmarking SIP Networking Devices
                   draft-ietf-bmwg-sip-bench-meth-03
Abstract

   This document describes the methodology for benchmarking Session
   Initiation Protocol (SIP) performance as described in the SIP
   benchmarking terminology document.  The methodology and terminology
   are to be used for benchmarking signaling plane performance with
   varying signaling and media load.  Signaling plane performance
   covers both scale and session establishment rate.  The SIP devices
   to be benchmarked may be a single device under test (DUT) or a
   system under test (SUT).  Benchmarks can be obtained and compared
   for different types of devices such as a SIP Proxy Server, an SBC,
   and a server paired with a media relay or Firewall/NAT device.
Status of this Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current Internet-
   Drafts is at http://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six months
   and may be updated, replaced, or obsoleted by other documents at any
   time.  It is inappropriate to use Internet-Drafts as reference
   material or to cite them other than as "work in progress."

   This Internet-Draft will expire on September 15, 2011.
Copyright Notice

   Copyright (c) 2011 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with respect
   to this document.  Code Components extracted from this document must
   include Simplified BSD License text as described in Section 4.e of
   the Trust Legal Provisions and are provided without warranty as
   described in the Simplified BSD License.
Table of Contents

   1.  Terminology . . . . . . . . . . . . . . . . . . . . . . . . .  4
   2.  Introduction  . . . . . . . . . . . . . . . . . . . . . . . .  4
   3.  Benchmarking Topologies . . . . . . . . . . . . . . . . . . .  5
   4.  Benchmarking Considerations . . . . . . . . . . . . . . . . .  5
     4.1.  Selection of SIP Transport Protocol . . . . . . . . . . .  5
     4.2.  Signaling Server  . . . . . . . . . . . . . . . . . . . .  5
     4.3.  Associated Media  . . . . . . . . . . . . . . . . . . . .  5
     4.4.  Selection of Associated Media Protocol  . . . . . . . . .  6
     4.5.  Number of Associated Media Streams per SIP Session  . . .  6
     4.6.  Session Duration  . . . . . . . . . . . . . . . . . . . .  6
     4.7.  Attempted Sessions per Second . . . . . . . . . . . . . .  6
     4.8.  Stress Testing  . . . . . . . . . . . . . . . . . . . . .  6
   5.  Reporting Format  . . . . . . . . . . . . . . . . . . . . . .  6
     5.1.  Test Setup Report . . . . . . . . . . . . . . . . . . . .  6
     5.2.  Device Benchmarks for IS  . . . . . . . . . . . . . . . .  7
     5.3.  Device Benchmarks for NS  . . . . . . . . . . . . . . . .  8
   6.  Test Cases  . . . . . . . . . . . . . . . . . . . . . . . . .  8
     6.1.  Baseline Session Establishment Rate . . . . . . . . . . .  8
     6.2.  Session Establishment Rate  . . . . . . . . . . . . . . .  9
     6.3.  Session Establishment Rate with Media . . . . . . . . . .  9
     6.4.  Session Establishment Rate with Loop Detection Enabled  . 10
     6.5.  Session Establishment Rate with Forking . . . . . . . . . 11
     6.6.  Session Establishment Rate with Forking and Loop
           Detection . . . . . . . . . . . . . . . . . . . . . . . . 11
     6.7.  Session Establishment Rate with TLS Encrypted SIP . . . . 12
     6.8.  Session Establishment Rate with IPsec Encrypted SIP . . . 13
     6.9.  Session Establishment Rate with SIP Flooding  . . . . . . 13
     6.10. Maximum Registration Rate . . . . . . . . . . . . . . . . 14
     6.11. Maximum Re-Registration Rate  . . . . . . . . . . . . . . 14
     6.12. Maximum IM Rate . . . . . . . . . . . . . . . . . . . . . 15
     6.13. Session Capacity without Media  . . . . . . . . . . . . . 16
     6.14. Session Capacity with Media . . . . . . . . . . . . . . . 16
     6.15. Session Capacity with Media and a Media Relay/NAT
           and/or Firewall . . . . . . . . . . . . . . . . . . . . . 17
   7.  IANA Considerations . . . . . . . . . . . . . . . . . . . . . 18
   8.  Security Considerations  . . . . . . . . . . . . . . . . . .  18
   9.  Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . 18
   10. References  . . . . . . . . . . . . . . . . . . . . . . . . . 18
     10.1. Normative References  . . . . . . . . . . . . . . . . . . 18
     10.2. Informative References  . . . . . . . . . . . . . . . . . 19
   Authors' Addresses  . . . . . . . . . . . . . . . . . . . . . . . 19
   [...]
   SIP permits a wide range of configuration options that are also
   explained in the Test Setup section.  Benchmark metrics could
   possibly be impacted by Associated Media.  The selected values for
   Session Duration and Media Streams Per Session enable benchmarks
   to be obtained without Associated Media.  Session Setup Rate could
   possibly be impacted by the selected value for Maximum Sessions
   Attempted.  The benchmark for Session Establishment Rate is
   measured with a fixed value for maximum Session Attempts.
3.  Benchmarking Topologies

   Familiarity with the benchmarking models in Section 2.2 of
   [I-D.sip-bench-term] is assumed.  Figures 1 through 9 in
   [I-D.sip-bench-term] contain the canonical topologies that can be
   used to perform the benchmarking tests listed in this document.
4.  Benchmarking Considerations

4.1.  Selection of SIP Transport Protocol

   Test cases may be performed with any transport protocol supported
   by SIP.  This includes, but is not limited to, SIP over TCP, SIP
   over UDP, and SIP over TLS.  The transport protocol used for SIP
   must be reported with the benchmarking results.
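   As a minimal sketch of how a Tester implementation might act on the
   transport selection above: open a datagram socket for UDP and a
   stream socket for TCP or TLS (TLS being layered over TCP).  The
   helper name is hypothetical; nothing in this document mandates a
   particular API.

```python
import socket

def sip_transport_socket(transport):
    """Open the SIP signaling socket for the transport under test.

    Hypothetical Tester helper: the methodology above only requires
    that whichever transport is used be reported with the results.
    """
    if transport == "udp":
        return socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    if transport in ("tcp", "tls"):
        # TLS for SIP is layered on a TCP connection.
        return socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    raise ValueError("unsupported SIP transport: %s" % transport)
```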
4.2.  Signaling Server

   The Server is a SIP-speaking device that complies with [RFC3261].
   The purpose of this document is to benchmark SIP performance, not
   conformance.  Conformance to [RFC3261] is assumed for all tests.
   The Server may be the DUT or a component of a SUT that includes
   Firewall and/or NAT functionality.  The components of the SUT may
   be a single physical device or separate devices.
4.3.  Associated Media

   Some tests may require associated media to be present for each SIP

   [...]

   IM Rate = _______________________________ (IM messages per second)
6.  Test Cases

6.1.  Baseline Session Establishment Rate

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          Figure 2 in [I-D.sip-bench-term], depending on whether the
          DUT is being benchmarked as a user agent client or user
          agent server.  If a SUT is being benchmarked, configure the
          SUT as shown in Figure 5 or Figure 6 in
          [I-D.sip-bench-term].
      2.  Configure Tester with a Session Attempt Rate = 100 SPS,
          maximum Session Attempts = 100,000 and Media Streams Per
          Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment
          Rate is obtained and recorded.

   [...]

   performed on different test beds or simply to better understand
   the impact of the DUT/SUT on the test bed in question.
6.2.  Session Establishment Rate

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          Figure 2 in [I-D.sip-bench-term], depending on whether the
          DUT is being benchmarked as a user agent client or user
          agent server.  If a SUT is being benchmarked, configure the
          SUT as shown in Figure 5 or Figure 6 in
          [I-D.sip-bench-term].
      2.  Configure Tester with a Session Attempt Rate = 100 SPS,
          maximum Session Attempts = 100,000 and Media Streams Per
          Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment
          Rate is obtained and recorded.

   Expected Results:  This is the scenario to obtain the maximum
      Session Establishment Rate of the DUT/SUT.
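   The 50% reduce/increase search of steps 5 through 7 can be sketched
   as follows.  This is a minimal Python sketch, not part of the
   methodology: `run_trial` is a hypothetical hook standing in for one
   Tester pass (steps 3-4, returning the Session Attempt Failure
   count), and the shrinking step size with a cutoff is an assumed
   termination condition, since the procedure itself does not state
   when to stop iterating.

```python
def find_session_establishment_rate(run_trial, start_rate=100.0,
                                    min_step_sps=1.0):
    """Search for the highest Session Attempt Rate with zero failures.

    run_trial(rate) stands in for one full pass of steps 3-4 and
    returns the number of Session Attempt Failures observed at that
    rate.  The 50% moves mirror steps 5-6; halving the adjustment
    each pass and stopping below min_step_sps is an assumption.
    """
    rate = start_rate
    best_passing = 0.0
    factor = 0.5                      # 50% adjustment, per steps 5-6
    while rate * factor >= min_step_sps:
        if run_trial(rate) > 0:       # step 5: any failure -> back off
            rate *= (1 - factor)
        else:                         # step 6: clean run -> push higher
            best_passing = max(best_passing, rate)
            rate *= (1 + factor)
        factor /= 2                   # narrow the search each pass
    return best_passing               # highest rate with zero failures
```

   With a stub that fails above 150 SPS, the search settles on a
   passing rate at or below that threshold.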
6.3.  Session Establishment Rate with Media

   Objective:
      To benchmark the Session Establishment Rate of the SUT with
      zero failures when Associated Media is included in the
      benchmark test.

   Procedure:
      1.  Configure the SUT as shown in Figure 6 or Figure 9 in
          [I-D.sip-bench-term].
      2.  Configure Tester for a Session Attempt Rate = 100 SPS,
          maximum Session Attempts = 100,000 and Media Streams Per
          Session = 1.  The rate of offered load for each media
          stream SHOULD be (eq 1) Offered Load per Media Stream =
          Throughput / maximum Sessions Attempted, where Throughput
          is defined in [RFC2544].
      3.  Start Tester to initiate SIP Session establishment with the
          SUT and transmit media through the SUT to a destination
          other than the server.
      4.  At the Tester measure Session Attempt Failures, total
          Established Sessions, and Packet Loss [RFC2544] of the
          media.
      5.  If a Session Attempt Failure or Packet Loss is recorded
          then reduce the Session Attempt Rate configured on the
          Tester by 50%.
      6.  If no Session Attempt Failure or Packet Loss is recorded
          then increase the Session Attempt Rate configured on the
          Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment
          Rate is obtained and recorded.
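   Equation 1 in step 2 is a single division; the sketch below only
   makes the arithmetic concrete.  The Throughput figure used here is
   illustrative, not a measured value: the actual number comes from
   [RFC2544] Throughput testing of the test bed in question.

```python
def offered_load_per_media_stream(throughput_pps, max_sessions_attempted):
    """Eq 1: Offered Load per Media Stream =
    Throughput / maximum Sessions Attempted.

    throughput_pps is the RFC 2544 Throughput of the test bed in
    packets per second (a hypothetical figure in the example below).
    """
    return throughput_pps / max_sessions_attempted

# Example: an (assumed) RFC 2544 Throughput of 2,000,000 pps with the
# 100,000 maximum Session Attempts used in this procedure gives an
# offered load of 20 pps per media stream.
rate = offered_load_per_media_stream(2_000_000, 100_000)
```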
   [...]

   those obtained without media in the case where the server and the
   NAT, Firewall or Media Relay are running on the same platform.
6.4.  Session Establishment Rate with Loop Detection Enabled

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when the Loop Detection option is enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          Figure 2 in [I-D.sip-bench-term], depending on whether the
          DUT is being benchmarked as a user agent client or user
          agent server.  If a SUT is being benchmarked, configure the
          SUT as shown in Figure 5 or Figure 6 in
          [I-D.sip-bench-term].
      2.  Configure Tester for a Session Attempt Rate = 100 SPS,
          maximum Session Attempts = 100,000 and Media Streams Per
          Session = 0.
      3.  Turn on the Loop Detection option in the DUT or SUT.
      4.  Start Tester to initiate SIP Session establishment with the
          DUT.
      5.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      6.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      7.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Session Establishment
          Rate is obtained and recorded.

   Expected Results:  Session Establishment Rate results obtained with
      Loop Detection may be lower than those obtained without Loop
      Detection enabled.
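   For context on why Loop Detection costs throughput, a minimal
   sketch of the kind of per-request check it implies: a proxy treats
   a request as looped when it finds evidence of its own earlier
   processing in the Via set (RFC 3261, Section 16.3).  Matching on
   the branch parameter alone, as below, is a deliberate
   simplification for illustration; real implementations compare a
   hash computed over routing-relevant fields.

```python
def detects_loop(via_headers, own_branch):
    """Simplified loop check: has a Via branch this proxy previously
    inserted come back in the incoming request?  Illustrative only;
    see RFC 3261, Section 16.3 for the normative procedure."""
    return any(own_branch in via for via in via_headers)

# A request that has already transited this proxy carries its branch:
vias = ["SIP/2.0/UDP p1.example.com;branch=z9hG4bKa7c6",
        "SIP/2.0/UDP ua.example.com;branch=z9hG4bK776a"]
looped = detects_loop(vias, "z9hG4bKa7c6")      # True
clean = detects_loop(vias, "z9hG4bKdeadbeef")   # False
```

   Because this check runs on every incoming request, enabling the
   option adds per-request work, which is what this test quantifies.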
6.5.  Session Establishment Rate with Forking

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when the Forking Option is enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          Figure 2 in [I-D.sip-bench-term], depending on whether the
          DUT is being benchmarked as a user agent client or user
          agent server.  If a SUT is being benchmarked, configure the
          SUT as shown in Figure 5 or Figure 6 in
          [I-D.sip-bench-term].
      2.  Configure Tester for a Session Attempt Rate = 100 SPS,
          maximum Session Attempts = 100,000 and Media Streams Per
          Session = 0.
      3.  Set the number of endpoints that will receive the forked
          invitation to a value of 2 or more (subsequent tests may
          increase this value at the discretion of the tester).
      4.  Start Tester to initiate SIP Session establishment with the
          DUT.
      5.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      6.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      7.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      8.  Repeat steps 4 through 7 until the Session Establishment
          Rate is obtained and recorded.

   Expected Results:  Session Establishment Rate results obtained with
      Forking may be lower than those obtained without Forking
      enabled.
6.6.  Session Establishment Rate with Forking and Loop Detection

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when both the Forking and Loop Detection Options
      are enabled.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          Figure 2 in [I-D.sip-bench-term], depending on whether the
          DUT is being benchmarked as a user agent client or user
          agent server.  If a SUT is being benchmarked, configure the
          SUT as shown in Figure 5 or Figure 6 in
          [I-D.sip-bench-term].
      2.  Configure Tester for a Session Attempt Rate = 100 SPS,
          maximum Session Attempts = 100,000 and Media Streams Per
          Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.  Enable the Loop Detection Options on the DUT.
      5.  Set the number of endpoints that will receive the forked
          invitation to a value of 2 or more (subsequent tests may
          increase this value at the discretion of the tester).
      6.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.

   [...]

   Expected Results:  Session Establishment Rate results obtained with
      Forking and Loop Detection may be lower than those obtained with
      only Forking or Loop Detection enabled.
6.7.  Session Establishment Rate with TLS Encrypted SIP

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when using TLS encrypted SIP.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          Figure 2 in [I-D.sip-bench-term], depending on whether the
          DUT is being benchmarked as a user agent client or user
          agent server.  If a SUT is being benchmarked, configure the
          SUT as shown in Figure 5 or Figure 6 in
          [I-D.sip-bench-term].
      2.  Configure Tester for SIP over TCP, enable TLS, Session
          Attempt Rate = 100 SPS, maximum Session Attempts = 100,000
          and Media Streams Per Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment
          Rate is obtained and recorded.

   Expected Results:  Session Establishment Rate results obtained with
      TLS Encrypted SIP may be lower than those obtained with
      plaintext SIP.
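   As a sketch of the Tester-side TLS setup step 2 implies: the
   Tester needs a TLS client context that trusts the DUT's
   certificate before it can carry SIP over the TCP connection.  The
   helper below is an assumption of this example, not part of the
   procedure; in particular, pinning TLS 1.2 as a floor is a choice
   made here, and the CA file is whatever the test bed provisions.

```python
import ssl

def tester_tls_context(ca_file=None):
    """TLS client context a Tester could use for SIP over TLS.

    ca_file points at the test bed's CA bundle (an assumption of this
    sketch); None falls back to the platform's default trust store.
    """
    ctx = ssl.create_default_context(cafile=ca_file)
    # This sketch refuses anything older than TLS 1.2.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

   The resulting context would wrap the Tester's TCP socket to the
   DUT (conventionally port 5061 for SIP over TLS).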
6.8.  Session Establishment Rate with IPsec Encrypted SIP

   Objective:
      To benchmark the Session Establishment Rate of the DUT/SUT with
      zero failures when using IPsec encrypted SIP.

   Procedure:
      1.  Configure the DUT in the test topology shown in Figure 1 or
          Figure 2 in [I-D.sip-bench-term], depending on whether the
          DUT is being benchmarked as a user agent client or user
          agent server.  If a SUT is being benchmarked, configure the
          SUT as shown in Figure 5 or Figure 6 in
          [I-D.sip-bench-term].
      2.  Configure Tester for SIP over TCP, enable IPsec, Session
          Attempt Rate = 100 SPS, maximum Session Attempts = 100,000
          and Media Streams Per Session = 0.
      3.  Start Tester to initiate SIP Session establishment with the
          DUT.
      4.  Measure Session Attempt Failures and total Established
          Sessions at the Tester.
      5.  If a Session Attempt Failure is recorded then reduce the
          Session Attempt Rate configured on the Tester by 50%.
      6.  If no Session Attempt Failure is recorded then increase the
          Session Attempt Rate configured on the Tester by 50%.
      7.  Repeat steps 3 through 6 until the Session Establishment
          Rate is obtained and recorded.

   Expected Results:  Session Establishment Rate results obtained with
      IPsec Encrypted SIP may be lower than those obtained with
      plaintext SIP.
6.9.  Session Establishment Rate with SIP Flooding

   Objective:
      To benchmark the Session Establishment Rate of the SUT with zero
      failures when SIP Flooding is occurring.

   Procedure:
    1.  Configure the DUT in the test topology shown in Figure 1 or
        Figure 2 in [I-D.sip-bench-term] depending on whether the DUT
        is being benchmarked as a user agent client or user agent
        server.  If a SUT is being benchmarked, configure the SUT as
        shown in Figure 5 or Figure 6 in [I-D.sip-bench-term].
    2.  Configure Tester for SIP UDP with a Session Attempt Rate =
        100 SPS, maximum Session Attempts = 100,000, Associated Media
        Streams Per Session = 0, and SIP INVITE Message Flood = 500
        per second.
    3.  Start Tester to initiate SIP Session establishment with the
        SUT and SIP Flood targeted at the Server.
    4.  At the Tester measure Session Attempt Failures, total
        Established Sessions, and Packet Loss [RFC2544] of the media.
    5.  If a Session Attempt Failure or Packet Loss is recorded then
        reduce the Session Attempt Rate configured on the Tester by
        50%.
    6.  If no Session Attempt Failure or Packet Loss is recorded then
        increase the Session Attempt Rate configured on the Tester by
        50%.
    7.  Repeat steps 3 through 6 until the Session Establishment Rate
        is obtained and recorded.
    8.  Repeat steps 1 through 7 with SIP INVITE Message Flood = 1000
        per second.

   Expected Results: Session Establishment Rate results obtained with
   SIP Flooding may be degraded.
6.10.  Maximum Registration Rate

   Objective:
      To benchmark the maximum registration rate of the DUT/SUT with
      zero failures.

   Procedure:
    1.  Configure the DUT in the test topology shown in Figure 1 or
        Figure 2 in [I-D.sip-bench-term] depending on whether the DUT
        is being benchmarked as a user agent client or user agent
        server.  If a SUT is being benchmarked, configure the SUT as
        shown in Figure 5 or Figure 6 in [I-D.sip-bench-term].
    2.  Configure Tester with a Registration Rate = 100 SPS and
        maximum registrations attempted = 100,000.
    3.  Set the registration timeout value to at least 3600 seconds.
    4.  At the Tester measure failed registration attempts, total
        registrations and packet loss.
    5.  If a Failed Registration Attempt or Packet Loss is recorded
        then reduce the Attempted Registration Rate configured on the
        Tester by 50%.
    6.  If no Failed Registration or Packet Loss is recorded then
        increase the Attempted Registration Rate configured on the
        Tester by 50%.
    7.  Repeat steps 5 and 6 until all registrations have succeeded.
        This number is obtained and recorded.

   Expected Results:
6.11.  Maximum Re-Registration Rate

   Objective:
      To benchmark the maximum re-registration rate of the DUT/SUT
      with zero failures.

   Procedure:
    1.  Configure the DUT in the test topology shown in Figure 1 or
        Figure 2 in [I-D.sip-bench-term] depending on whether the DUT
        is being benchmarked as a user agent client or user agent
        server.  If a SUT is being benchmarked, configure the SUT as
        shown in Figure 5 or Figure 6 in [I-D.sip-bench-term].
    2.  Execute the test detailed in Section 6.10 to register the
        endpoints with the registrar.  The rest of the steps below
        MUST be performed at least 5 minutes after, but no more than
        15 minutes after, the test performed in Section 6.10.
    3.  Configure Tester for an attempted Registration Rate = 100 SPS
        and maximum registrations attempted = 100,000.
    4.  Configure Tester to re-register the same addresses-of-record
        that were registered in Section 6.10.
    5.  At the Tester measure failed registration attempts, total
        registrations and packet loss.
    6.  If a Failed Registration Attempt or Packet Loss is recorded
        then reduce the Attempted Registration Rate configured on the
        Tester by 50%.
    7.  If no Failed Registration or Packet Loss is recorded then
        increase the Attempted Registration Rate configured on the
        Tester by 50%.
    8.  Repeat steps 6 and 7 until all re-registrations have
        succeeded.  This number is obtained and recorded.

   Expected Results: The rate should be at least equal to, but not
   more than, the result of Section 6.10.
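Step 2's timing window (no sooner than 5 minutes, no later than 15 minutes after the Section 6.10 test) is easy to get wrong by hand; a test harness can enforce it mechanically. The helper below is a hypothetical sketch, not part of the draft: `run_rereg_test` stands in for whatever function drives the Section 6.11 procedure.

```python
import time

MIN_GAP_S = 5 * 60    # MUST wait at least 5 minutes after Section 6.10
MAX_GAP_S = 15 * 60   # ...but no more than 15 minutes after it

def run_reregistration_after(registration_finished_at, run_rereg_test):
    """Run the re-registration test inside the 5-15 minute window.

    registration_finished_at: time.monotonic() value captured when the
    Section 6.10 test completed.  run_rereg_test: hypothetical harness
    hook that executes the Section 6.11 procedure.
    """
    elapsed = time.monotonic() - registration_finished_at
    if elapsed < MIN_GAP_S:
        time.sleep(MIN_GAP_S - elapsed)   # wait out the minimum gap
    elif elapsed > MAX_GAP_S:
        # Registration state may have expired; the benchmark is invalid.
        raise RuntimeError("Section 6.10 results are stale; re-run it")
    return run_rereg_test()
```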
6.12.  Maximum IM Rate

   Objective:
      To benchmark the maximum IM rate of the SUT with zero failures.

   Procedure:
    1.  Configure the DUT in the test topology shown in Figure 1 or
        Figure 2 in [I-D.sip-bench-term] depending on whether the DUT
        is being benchmarked as a user agent client or user agent
        server.  If a SUT is being benchmarked, configure the SUT as
        shown in Figure 5 or Figure 6 in [I-D.sip-bench-term].
    2.  Configure Tester for an IM Rate = 100 SPS and maximum IM
        Attempts = 100,000.
    3.  At the Tester measure Failed IM Attempts, total IMs, and
        Packet Loss.
    4.  If a Failed IM Attempt or Packet Loss is recorded then reduce
        the Attempted IM Rate configured on the Tester by 50%.
    5.  If no Failed IM or Packet Loss is recorded then increase the
        Attempted IM Rate configured on the Tester by 50%.
    6.  Repeat steps 3 through 5 until the Maximum IM Rate is
        obtained and recorded.

   Expected Results:
6.13.  Session Capacity without Media

   Objective:
      To benchmark the Session Capacity of the SUT without Associated
      Media.

   Procedure:
    1.  Configure the DUT in the test topology shown in Figure 1 or
        Figure 2 in [I-D.sip-bench-term] depending on whether the DUT
        is being benchmarked as a user agent client or user agent
        server.  If a SUT is being benchmarked, configure the SUT as
        shown in Figure 5 or Figure 6 in [I-D.sip-bench-term].
    2.  Configure Tester for a Session Attempt Rate = Session
        Establishment Rate, maximum Session Attempts = 10,000 and
        Media Streams Per Session = 0.
    3.  Start Tester to initiate SIP Session establishment with the
        DUT.
    4.  Measure Session Attempt Failures, total Established Sessions,
        and Packet Loss [RFC2544] at the Tester.
    5.  If a Session Attempt Failure or Packet Loss is recorded then
        reduce the maximum Session Attempts configured on the Tester
        by 5,000.
    6.  If no Session Attempt Failure or Packet Loss is recorded then
        increase the maximum Session Attempts configured on the
        Tester by 10,000.
    7.  Repeat steps 3 through 6 until the Session Capacity is
        obtained and recorded.
    8.  Repeat steps 1 through 7 for multimedia in which media
        streams per session = 2.

   Expected Results: This is the scenario to obtain the maximum
   Session Capacity of the DUT/SUT.
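Unlike the rate tests, steps 5 and 6 here adjust the maximum Session Attempts additively. A hedged sketch of a convergent version follows; `run_trial` is again a hypothetical harness hook, and a bisection step is added because the fixed -5,000/+10,000 adjustments alone would never settle.

```python
def find_session_capacity(run_trial, start_attempts=10_000, resolution=1_000):
    """Find the largest maximum-Session-Attempts value with zero failures.

    run_trial(n) -> True if any Session Attempt Failure or Packet Loss
    was observed with maximum Session Attempts = n (hypothetical hook).
    Convergent reading of steps 5-7: step up by 10,000 while no failure
    has been seen, then bisect between the largest clean value and the
    smallest failing value.
    """
    lo, hi = 0, None            # lo = largest clean n, hi = smallest failing n
    n = start_attempts
    while hi is None or hi - lo > resolution:
        if run_trial(n):                 # steps 3-4: run and measure
            hi = n                       # step 5 case: capacity exceeded
        else:
            lo = n                       # step 6 case: clean run
        n = n + 10_000 if hi is None else (lo + hi) // 2
    return lo                            # step 7: the capacity to record
```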
6.14.  Session Capacity with Media

   Objective:
      To benchmark the session capacity of the DUT/SUT with Associated
      Media.

   Procedure:
    1.  Configure the DUT in the test topology shown in Figure 3 or
        Figure 4 in [I-D.sip-bench-term] depending on whether the DUT
        is being benchmarked as a user agent client or user agent
        server.  If a SUT is being benchmarked, configure the SUT as
        shown in Figure 9 of [I-D.sip-bench-term].
    2.  Configure Tester for a Session Attempt Rate = 100 SPS, Session
        Duration = 30 sec, maximum Session Attempts = 100,000 and
        Media Streams Per Session = 1.

           NOTE: The total offered load to the DUT/SUT SHOULD be equal
           to the Throughput of the DUT/SUT as defined in [RFC2544].
           The offered load to the DUT/SUT for each media stream
           SHOULD be equal to Throughput/Maximum Session Attempts.

    3.  Start Tester to initiate SIP Session establishment with the
        SUT and transmit media through the SUT to a destination other
        than the server.

   Expected Results: Session Capacity results obtained with Associated
   Media with any number of media streams per SIP session will be
   identical to the Session Capacity results obtained without media.
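The NOTE in step 2 fixes the per-stream offered load as Throughput divided by Maximum Session Attempts; a worked example makes the arithmetic concrete. The throughput figure below is an assumption for illustration, not a value from this document.

```python
# Per the NOTE:  per-stream offered load = Throughput / Maximum Session Attempts
throughput_pps = 1_000_000       # assumed RFC 2544 throughput of the DUT/SUT
max_session_attempts = 100_000   # from step 2 of the procedure

per_stream_load_pps = throughput_pps / max_session_attempts
# -> 10.0 packets per second offered on each media stream
```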
6.15.  Session Capacity with Media and a Media Relay/NAT and/or
       Firewall

   Objective:
      To benchmark the Session Capacity of the SUT with Associated
      Media when the media traverses a media relay, NAT and/or
      firewall.

   Procedure:
    1.  Configure the SUT as shown in Figure 6 or Figure 9 in
        [I-D.sip-bench-term].
    2.  Configure Tester for a Session Attempt Rate = 100 SPS, Session
        Duration = 30 sec, maximum Session Attempts = 100,000 and
        Media Streams Per Session = 1.

           NOTE: The offered load for each media stream SHOULD be as
           in Equation 1.

    3.  Start Tester to initiate SIP Session establishment with the
        SUT and transmit media through the SUT to a destination other
        than the server.
    4.  Measure Session Attempt Failures and total Established
        Sessions at the Tester.
    5.  If a Session Attempt Failure is recorded then reduce the
        maximum Session Attempts configured on the Tester by 5,000.
    6.  If no Session Attempt Failure is recorded then increase the
        maximum Session Attempts configured on the Tester by 10,000.
    7.  Repeat steps 3 through 6 until the Session Capacity is
        obtained and recorded.
10.1.  Normative References

   [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate
              Requirement Levels", BCP 14, RFC 2119, March 1997.

   [RFC2544]  Bradner, S. and J. McQuaid, "Benchmarking Methodology
              for Network Interconnect Devices", RFC 2544, March 1999.

   [I-D.sip-bench-term]
              Davids, C., Gurbani, V., and S. Poretsky, "SIP
              Performance Benchmarking Terminology",
              draft-ietf-bmwg-sip-bench-term-03 (work in progress),
              March 2011.

10.2.  Informative References

   [RFC3261]  Rosenberg, J., Schulzrinne, H., Camarillo, G.,
              Johnston, A., Peterson, J., Sparks, R., Handley, M.,
              and E. Schooler, "SIP: Session Initiation Protocol",
              RFC 3261, June 2002.
Authors' Addresses

   Carol Davids
   Illinois Institute of Technology
   201 East Loop Road
   Wheaton, IL 60187
   USA

   Phone: +1 630 682 6024
   Email: davids@iit.edu

   Vijay K. Gurbani
   Bell Laboratories, Alcatel-Lucent
   1960 Lucent Lane
   Rm 9C-533
   Naperville, IL 60566
   USA

   Phone: +1 630 224 0216
   Email: vkg@bell-labs.com

   Scott Poretsky
   Allot Communications
   300 TradeCenter, Suite 4680
   Woburn, MA 08101
   USA

   Phone: +1 508 309 2179
   Email: sporetsky@allot.com