CVPixelBuffer to WebSocket #144

Open
yigitarik opened this issue Dec 13, 2023 · 2 comments

Comments

yigitarik commented Dec 13, 2023

Hi everyone. I'm trying to send a CVPixelBuffer to my WebSocket, but the WebSocket only supports a few data types for sending, like data, ping, pong, and string. When I try to convert the buffer data, it comes back empty. My goal is to record the meeting on the server, but at this point I cannot send the buffer data to my WebSocket. Can anyone help me? @stasel @kmadiar

stasel (Owner) commented Dec 19, 2023

Hey, sending Data through WebSockets should work. Can you share your code so others can get more context on what you are trying to achieve?
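
For reference, a minimal sketch of sending raw Data over a native WebSocket with URLSessionWebSocketTask (the URL and function below are placeholders, not from this project):

import Foundation

// Minimal sketch: a binary WebSocket frame carries raw Data directly,
// so no base64/JSON wrapping is needed. The URL here is a placeholder.
let task = URLSession.shared.webSocketTask(with: URL(string: "wss://example.com/stream")!)
task.resume()

func send(_ data: Data, over task: URLSessionWebSocketTask) {
    task.send(.data(data)) { error in
        if let error = error {
            print("WebSocket send failed: \(error)")
        }
    }
}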

yigitarik (Author) commented

Hi again. OK, I am sharing my code.

This is my capturer code. At the end of this code block, as you can see, I add a FrameRenderer() instance to localVideoTrack.

func startCaptureLocalVideo(renderer: RTCVideoRenderer) {
    guard let capturer = self.videoCapturer as? RTCCameraVideoCapturer else {
        return
    }

    guard
        let frontCamera = (RTCCameraVideoCapturer.captureDevices().first { $0.position == .front }),

        // choose the highest resolution
        let format = (RTCCameraVideoCapturer.supportedFormats(for: frontCamera).sorted { (f1, f2) -> Bool in
            let width1 = CMVideoFormatDescriptionGetDimensions(f1.formatDescription).width
            let width2 = CMVideoFormatDescriptionGetDimensions(f2.formatDescription).width
            return width1 < width2
        }).last,

        // choose the highest fps
        let fps = (format.videoSupportedFrameRateRanges.sorted { return $0.maxFrameRate < $1.maxFrameRate }.last)
    else {
        return
    }

    capturer.startCapture(with: frontCamera,
                          format: format,
                          fps: Int(fps.maxFrameRate))
    bufferDelegate?.capturer(capturer, didCapture: videoFrame!)
    localCapturer?.startCapture(with: frontCamera, format: format, fps: Int(fps.maxFrameRate))

    self.localVideoTrack?.add(renderer)

    // FrameRenderer forwards every rendered frame to the socket
    self.localVideoTrack?.add(FrameRenderer())
}

The FrameRenderer() class looks like this:

class FrameRenderer: NSObject, RTCVideoRenderer {

    let manager = SocketManager(
        socketURL: URL(string: "http://socket.clumpapp.com")!,
        config: [.log(false), .compress, .secure(false), .selfSigned(true), .forcePolling(true)])
    var socket: SocketIOClient!

//    private var signalClient: SignalingClient
    override init() {
        socket = manager.defaultSocket
        socket.connect()
        socket.on(clientEvent: .connect) { data, ack in
            print("Clump connected")
        }
        socket.on(clientEvent: .error) { data, ack in
            print("error")
        }
        super.init()
    }

    weak var delegate: FrameRendererDelegate?

    func setSize(_ size: CGSize) {
        print(size.width)
        print(size.height)
    }

    func renderFrame(_ frame: RTCVideoFrame?) {
        guard let rtcPixelBuffer = frame!.buffer as? RTCCVPixelBuffer else {
            print("Error: frame.buffer is not an RTCCVPixelBuffer")
            return
        }

        let imageBuffer = rtcPixelBuffer.pixelBuffer

        print(imageBuffer)

        // Note: despite the name, this is JSON-encoded Data?, not a String
        var jsonString = convertPixelBufferToJSONData(pixelBuffer: imageBuffer)

        socket.emit("joinRoom", [jsonString])
    }
}

This socket is just an example socket for testing; I am sending the data to the joinRoom event.
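
As an aside, a sketch not from this project: if the server can accept raw binary, socket.io-client-swift can emit Data directly and it is sent as a binary attachment, which avoids the base64/JSON wrapping. The "frame" event name below is only a placeholder:

import Foundation
import SocketIO

// Sketch, assuming the server listens for a binary "frame" event.
// Data conforms to SocketData, so it goes out as a Socket.IO binary attachment.
func sendFrame(_ frameData: Data, over socket: SocketIOClient) {
    socket.emit("frame", frameData)
}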

And my convert function:

func convertPixelBufferToJSONData(pixelBuffer: CVPixelBuffer) -> Data? {
    // Copy the raw pixel bytes, then wrap them as a base64 string in a JSON payload
    var data = Data()
    print("-*-*-*")
    CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags.readOnly)
    let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags.readOnly)

    let bufferSize = CVPixelBufferGetDataSize(pixelBuffer)
    data = Data(bytes: baseAddress!, count: bufferSize)
    print(baseAddress)
    print(data)
    print("-*-*-*")

    let jsonDictionary: [String: Any] = [
        "pixelBufferData": data.base64EncodedString()
    ]

    guard let jsonData = try? JSONSerialization.data(withJSONObject: jsonDictionary, options: []) else {
        print("nil")
        return nil
    }

    return jsonData
}

I guess this convert function is not working correctly. As I said, my sole purpose is to record the meeting on the server simultaneously. I am open to any other advice on how to do it. Thanks. @stasel
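
One likely culprit in the snippet above: the base address is unlocked before the bytes are copied. A minimal sketch that holds the lock while copying (assuming a non-planar pixel format):

import CoreVideo
import Foundation

// Sketch: keep the buffer locked for the whole copy. Camera/WebRTC frames are
// often planar (e.g. NV12); those would need each plane copied via
// CVPixelBufferGetBaseAddressOfPlane instead.
func pixelBufferData(_ pixelBuffer: CVPixelBuffer) -> Data? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer) else {
        return nil
    }
    let size = CVPixelBufferGetDataSize(pixelBuffer)
    return Data(bytes: baseAddress, count: size)
}

Raw frames are also large (roughly width × height × 1.5–4 bytes each, many times per second), so compressing each frame or encoding a proper video stream before sending is usually more practical for server-side recording.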
