Can a server determine if the client is on LAN or WLAN?

I’d like to restrict access to my server to clients who connect to their home router over LAN rather than WLAN. Is this possible at all, e.g. by examining packets?

Further information: I’m concerned about timing rather than security. The server runs a real-time audio application, and WLAN (generally speaking) introduces much more jitter than LAN does. So, my question, rephrased, is:

Can I determine the amount of jitter by looking at the amount of UDP packet loss or out-of-sequence arrivals? What are some commonly accepted tools to achieve this?


Nope.

to their home router

means you have zero insight into what happens behind the router. Case closed. There is nothing you can do from the server side to get information the network does not provide.


If you are concerned about RTT and jitter, measure exactly those (see the sketch below). Don't segregate users by their access technology. If their WiFi or PLC or whatever they use is good enough to give acceptable RTT and low jitter, why discard them just because they use WiFi or PLC or whatever?
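As for commonly accepted tools: iperf/iperf3 in UDP mode already reports exactly these numbers (jitter, datagram loss, and out-of-order arrivals), and plain ping or mtr covers RTT. If you want to do the measurement inside your own server, here is a minimal sketch of the same idea in Python. It assumes the client cooperates by echoing each datagram straight back on a hypothetical UDP port 9999; the address, port, and packet layout are all made up for illustration. Loss and out-of-sequence counts come from sequence numbers, and jitter from the variation in round-trip times, smoothed RFC-3550 style.

```python
import select
import socket
import struct
import time

PROBES = 100        # number of probe datagrams
INTERVAL = 0.02     # 20 ms between probes
DRAIN_TIME = 1.0    # how long to wait for stragglers after the last probe

def probe(client_addr):
    """Send timestamped UDP probes and report loss, reordering and jitter."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setblocking(False)

    rtts = {}           # seq -> round-trip time in seconds
    arrival_order = []  # sequence numbers in the order replies came back
    jitter = 0.0        # smoothed jitter estimate, in seconds
    last_rtt = None

    def drain():
        # Pull every reply currently queued on the socket.
        nonlocal jitter, last_rtt
        while select.select([sock], [], [], 0)[0]:
            data, _ = sock.recvfrom(64)
            seq, sent_at = struct.unpack("!Id", data)
            rtt = time.monotonic() - sent_at
            rtts[seq] = rtt
            arrival_order.append(seq)
            if last_rtt is not None:
                # Exponentially smoothed absolute RTT variation,
                # gain 1/16 as in RFC 3550's interarrival jitter.
                jitter += (abs(rtt - last_rtt) - jitter) / 16.0
            last_rtt = rtt

    for seq in range(PROBES):
        # Payload: 4-byte sequence number + 8-byte monotonic send timestamp.
        sock.sendto(struct.pack("!Id", seq, time.monotonic()), client_addr)
        time.sleep(INTERVAL)
        drain()

    # Give late replies a moment to arrive before declaring them lost.
    deadline = time.monotonic() + DRAIN_TIME
    while time.monotonic() < deadline:
        drain()
        time.sleep(0.01)

    lost = PROBES - len(rtts)
    reordered = sum(1 for a, b in zip(arrival_order, arrival_order[1:]) if b < a)
    mean_rtt = sum(rtts.values()) / len(rtts) if rtts else float("nan")
    print(f"loss: {lost}/{PROBES}  reordered: {reordered}  "
          f"mean RTT: {mean_rtt * 1000:.2f} ms  jitter: {jitter * 1000:.2f} ms")

if __name__ == "__main__":
    probe(("192.0.2.10", 9999))  # hypothetical client echoing UDP on 9999
```

Note that jitter here is computed from RTT variation, not from loss or reordering; loss and out-of-sequence arrivals correlate with a bad link but are separate symptoms, which is why the sketch reports all three and lets you set thresholds on each.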

The reverse also holds: if a rock-solid wired connection shows 1 s of RTT, that doesn't mean the user is on WiFi. They may simply have a torrent running somewhere nearby, and the delay may be caused by traffic shaping with large buffers on the ISP's hardware, not by their physical connection.

Every sufficiently clever real-time application I've seen so far measures, and usually reports, this kind of information; none of them tries to detect whether I'm on WiFi.

And no: while software running on a computer can tell whether it is currently on WiFi or wired, the "wired" verdict isn't final. The ISP may have connected the building over WiMAX; there may be WiFi bridges inside the network; and so on. These are undetectable even to a computer inside the network that uses them, and the server of course has no way to find them out.