Different OpenWrt, different speed

Hello,

I’m experiencing very slow execution of a Python script.
The thing is, I have two Yuns: one with the original OpenWrt and one with Chaos Calmer (by RedSnake64).

The same Python script runs on both Yuns, and while on the first one execution takes around 3-4 seconds, on the second one it takes 15-16 seconds!! That is way too much for my requirements.

Here is the script, so you can tell me what you think the problem could be:

#!/usr/bin/env python -u

from struct import *
import socket
from scapy.packet import *
from scapy.fields import *
from scapy.layers.inet import *
import time
import subprocess
import sys

myip = '192.168.xxx.xxx'
serverip = '192.168.xxx.xxx'
TCP_PORT = 5005
BUFFER_SIZE = 1024


#Custom Scapy Protocols
class X(Packet):
	name = "xxx"
	fields_desc=[ IntField("x", 0),
			IntField("xx", 0),
			IntField("xxx", 0),
			SignedIntField("xxxx", 0)]

class XX(Packet):
	name = "xx"
	fields_desc= [ IntField("x", 0),
			IntField("xx", 0),
			IntField("xxx", 0),
			IntField("xxxx", 0),
			IntField("xxxxx", 0),
			IntField("xxxxxx", 0),
			IntField("xxxxxxx", 0),
			IntField("xxxxxxx", 0),
			IEEEFloatField("xxxxxxxx", 0),
			IEEEDoubleField("ssss", 0),
			IEEEDoubleField("sssss", 0)]

#Build The Reservation Packet
# keyword arguments must be distinct field names of X (x, xx, xxx)
pkt = IP(len=16384, src=myip, dst=serverip,
	id=RandShort(), ttl=64)/TCP(sport=5005,
	dport=5005, flags="S", window=200,
	options=[('MSS', 1460), ('WScale', 2)])/X(x=int(sys.argv[1]), xx=int(sys.argv[2]), xxx=10)

t1 = pkt.time
#pkt.show()
spkt = str(pkt)

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((serverip, TCP_PORT))
s.send(spkt)

s1 = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s1.bind((myip, 5000))
bind_layers(TCP, XX, dport=5000)

values = []   # 'as' is a reserved word in Python and cannot be used as a variable name
t4 = 0
ok = 0
while True:

	data,addr = s1.recvfrom(BUFFER_SIZE)
	container = IP(data)
	#print container.time
	if(container.getlayer(XX).x != 3): 
		values.append(container.getlayer(XX).xxxxxxx)
		t4 = container.time
	else:
		
		t2 = container.getlayer(XX).xxxxx
		t3 = container.getlayer(XX).xxxx
		
		delta = ((t2-t1)+(t4-t3))/2
		ok = container.getlayer(XX).xxx - delta
		if(ok < 0):   # 'offset' was undefined; the computed value is stored in 'ok'
			ok_n = 10 - ok
		else:
			ok_n = ok 
		break
print values
print ok
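
To narrow down where those 15 seconds go, a quick check I can run on both Yuns is to time the Scapy imports separately from the rest. This is just a sketch around the same imports the script already uses, not part of the script itself:

#!/usr/bin/env python
# time the heavy Scapy imports separately from the send/receive phase
import time

t_start = time.time()
from scapy.packet import *
from scapy.fields import *
from scapy.layers.inet import *
print "scapy imports: %.1f s" % (time.time() - t_start)
# ...then build and send the packet as in the script above and compare
# the import time against the total run time

If the imports already take most of the time, the difference between the two images is in the Python/Scapy setup rather than in the network code.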

@mridolfi what did you want to do? Do you want both to run at the same speed? Did you check the processor speed? Is the software different on the two boards?

Did you know that this IP address (192.168.xxx.xxx) is not routable?

Jesse

I’d like to understand why it is so slow with the new OS.

How can I check the processor speed?

Is there a goal beyond knowing the different speeds? (This is a valid goal. Just trying to understand your needs.)

Jesse

Yes!

I mean, I’m designing a system where the Yun sends a message to a server and then waits for the reply. But if it takes too long to send this message, I can’t use the Chaos Calmer OpenWrt. The message exchange must be as fast as possible.

From the moment I launch the script, the packet should be sent out almost immediately, not after 15 seconds; that is too much.

@mridolfi Okay. This is still not enough information, but I can tell you one thing: Linux is not a real-time system. So your requirement that

the packet should be sent out almost immediately, not after 15 seconds; that is too much.

is not going to work. Under 99.9% of all conditions, you can get the message within a few milliseconds. However, every so often, for reasons I cannot give you (or know), you will be stretched beyond 15 seconds.

So going back to the original post: you have two different processors with slightly different clocks, so they are going to differ from time to time. What else can I say? Again, Linux is not a real-time system.
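
As for checking the processor speed: on OpenWrt you can read /proc/cpuinfo on the Linux side. A small sketch in Python (the exact field names are what a stock AR9331 image reports; yours may differ):

# print CPU details from /proc/cpuinfo
for line in open('/proc/cpuinfo'):
	if line.startswith(('system type', 'cpu model', 'BogoMIPS')):
		print line.strip()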

Jesse

Please try to precompile your Python modules with:

python -m compileall
python -m compileall /usr/lib/python2.7/bridge

This should speed up Python startup a little bit.
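
If most of the time turns out to be the Scapy import, it may also be worth precompiling the Scapy package itself. The path below is an assumption; check where Scapy is actually installed on your image first:

python -m compileall /usr/lib/python2.7/site-packages/scapy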