[Ns-bugs] [Bug 737] Backoff procedure is not invoked when transmission is deferred
code at nsnam.ece.gatech.edu
Fri Feb 26 01:34:06 PST 2010
--- Comment #11 from Mathieu Lacage <mathieu.lacage at sophia.inria.fr> 2010-02-26 04:34:05 EDT ---
(In reply to comment #10)
> I'd like to draw your attention to this bug, since it appears to be critical
> in multihop mesh/MANET networks. The reason is that none of our models really
> account for processing delays, so all re-transmissions (say, forwarding RREQ in
> AODV) occur simultaneously. A large number of exactly simultaneous transmissions
> leads to a significantly overestimated collision probability and even to
> incorrect protocol operation. A recent example of this behavior was reported
> to me by Kuba Wierusz.
Thanks a lot for the detailed diagram by Kirill; I do understand this problem
better now. The key issue I was confused about was the term "retransmission" in
the context of the MAC. For me, it meant a retransmission attempt after a failed
transmission. For Kirill and you, it means a MAC-level forwarding attempt.
> Fixing bug 737 gives us a simple workaround for this problem without the need
> to explicitly account for processing delay. Indeed, right now, when a wifi
> device is asked to retransmit a packet without any delay, it will defer this
> for DIFS. After the proposed bugfix, every deferred TX will start a backoff --
> exactly what is needed to avoid artificial collisions.
My initial reaction to this proposed solution is that it is wrong: we should
not unconditionally start a backoff after a packet reception, because that goes
against both the spirit and the letter of the 802.11 spec.
It seems to me that the problem is not that we need to model processing delays:
we need to model non-deterministic, _varying_ processing delays which change
from one station to another and, potentially, from one packet to another
within the same station, right? If so, I would support adding a delay in
MacLow when we receive a packet, before forwarding it to the upper layers, and
making that delay be picked from a RandomVariable whose default is a gaussian
distribution centered around 10us with a non-zero variance. I will ask a
colleague what decent values would be for the mean/variance to model some
PC-class hardware.