OpenCV live stream video over socket in Python 3
Solution 1:
I'm the author of the VidGear Video Processing Python library, which now also provides the NetGear API, designed exclusively to transfer video frames synchronously between interconnected systems over the network in real time. You can try it as follows:
A. Server End: (Bare-Minimum example)
Open your favorite terminal and execute the following Python code:
Note: You can end streaming at any time on both the server and client side by pressing [Ctrl+C] on your keyboard at the server end!
# import libraries
from vidgear.gears import VideoGear
from vidgear.gears import NetGear

stream = VideoGear(source='test.mp4').start()  # Open any video stream
server = NetGear()  # Define netgear server with default settings

# infinite loop until [Ctrl+C] is pressed
while True:
    try:
        # read frames from the stream
        frame = stream.read()

        # check if frame is None
        if frame is None:
            # if True break the infinite loop
            break

        # do something with frame here

        # send frame to the client
        server.send(frame)

    except KeyboardInterrupt:
        # break the infinite loop
        break

# safely close video stream
stream.stop()
# safely close server
server.close()
B. Client End: (Bare-Minimum example)
Then open another terminal on the same system, execute the following Python code, and watch the output:
# import libraries
from vidgear.gears import NetGear
import cv2

# define netgear client with `receive_mode = True` and default settings
client = NetGear(receive_mode=True)

# infinite loop
while True:
    # receive frames from the network
    frame = client.recv()

    # check if frame is None
    if frame is None:
        # if True break the infinite loop
        break

    # do something with frame here

    # Show output window
    cv2.imshow("Output Frame", frame)

    key = cv2.waitKey(1) & 0xFF
    # check for 'q' key-press
    if key == ord("q"):
        # if 'q' key-pressed break out
        break

# close output window
cv2.destroyAllWindows()
# safely close client
client.close()
As of now, NetGear supports two ZeroMQ messaging patterns, zmq.PAIR and zmq.REQ/zmq.REP, and the supported transport protocols are 'tcp' and 'ipc'.
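For example, to stream between two different machines instead of two terminals on the same system, you would pass the network settings explicitly on both ends. The sketch below is only a rough illustration; the IP address and port are placeholders, and you should confirm the exact parameter names against the docs linked below for your installed VidGear version:

# server side: bind to an explicit address/port (values are placeholders)
from vidgear.gears import NetGear
server = NetGear(address='192.168.1.10', port='5454', protocol='tcp', pattern=1)  # pattern=1 selects zmq.REQ/zmq.REP

# client side: must use the same address, port, protocol and pattern
from vidgear.gears import NetGear
client = NetGear(address='192.168.1.10', port='5454', protocol='tcp', pattern=1, receive_mode=True)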
More advanced usage can be found here: https://abhitronix.github.io/vidgear/latest/gears/netgear/overview/
Solution 2:
It is because you are receiving only a small amount of data, so the image is not complete. 8192 bytes is not enough 99.99% of the time, because almost every image is larger than 8 KB. You'll need to grab ALL the data sent by the sender in order to convert it to an image.
You can take a look at my code on GitHub and change it according to your needs.
Long story short, an easy option is to first send the number of bytes to the client, and then send the image itself. In the client code, after receiving the length of the image, loop until all bytes are received. For example:
...
img_len = 175428  # length received from sender.py
e = 0
data = b''  # bytes, since sock.recv() returns bytes in Python 3
while e < img_len:
    d = sock.recv(1024)
    e += len(d)
    data += d
nparr = np.frombuffer(data, np.uint8)  # np.fromstring is deprecated
frame = cv2.imdecode(nparr, cv2.IMREAD_COLOR)
cv2.imshow('frame', frame)
cv2.waitKey(1)  # needed for the window to actually refresh
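On the sending side, the same length-prefix idea could look roughly like the sketch below. This is not the original sender.py; the socket address, port, 4-byte header format, and the JPEG encoding step are all assumptions made for illustration:

# hypothetical sender: length-prefixed frames over a plain TCP socket
import socket
import struct
import cv2

cap = cv2.VideoCapture(0)  # grab frames from the default webcam
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(('127.0.0.1', 9999))  # placeholder address/port of the receiver

while True:
    ret, frame = cap.read()
    if not ret:
        break
    # JPEG-encode the frame so the payload is much smaller than raw pixels
    ok, buf = cv2.imencode('.jpg', frame)
    if not ok:
        continue
    data = buf.tobytes()
    # send a fixed 4-byte big-endian length header, then the image bytes
    sock.sendall(struct.pack('>I', len(data)))
    sock.sendall(data)

cap.release()
sock.close()

On the receiving end you would then read exactly 4 bytes first and unpack them with struct.unpack('>I', ...) to get img_len, instead of hard-coding it as in the snippet above.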