Browsers cannot push or pull streams over the RTMP protocol, so the backend has to process the transmitted data and relay it to the page. To keep playback close to real time, the relayed data is delivered to the browser over WebSocket.
First, use OBS and VLC to verify that ordinary RTMP push and pull streaming works against the streaming server.
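For example, assuming a local RTMP server (such as nginx with the rtmp module) with an application named live, the addresses would look roughly like this; the host, application name, and stream key are only illustrative:

Push (OBS, Settings > Stream > Custom): rtmp://localhost/live with stream key test
Pull (VLC, Media > Open Network Stream): rtmp://localhost/live/test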
Then run the local backend to relay the video.
JavaCV is used to transfer the audio and video.
import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;

import javax.swing.JFrame;

// Create and configure the grabber
FFmpegFrameGrabber grabber = FFmpegFrameGrabber.createDefault(inputPath);
grabber.setOption("rtsp_transport", "tcp");
grabber.setImageWidth(960);
grabber.setImageHeight(540);

// Start the grabber
grabber.start();

// Local playback window
CanvasFrame canvasFrame = new CanvasFrame("Live stream from " + inputPath);
canvasFrame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
canvasFrame.setAlwaysOnTop(true);

// Pull and play frames
while (true) {
    Frame frame = grabber.grabImage();   // pull one video frame
    if (frame == null) {                 // end of stream
        break;
    }
    canvasFrame.showImage(frame);        // render it in the window
}
During this test the live video displays normally.
Next, decode the audio/video, convert it into a binary stream, and send it to the frontend.
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.Java2DFrameConverter;

import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;

// Read the RTMP stream from the video source
log.info("stream address {}", url);
FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(url);
Java2DFrameConverter converter = new Java2DFrameConverter();
try {
    grabber.start();
    Frame frame;
    while ((frame = grabber.grabFrame()) != null) {
        log.info("demuxing frame");

        // Audio parameters of the frame (read here but not forwarded yet)
        int sampleRate = frame.sampleRate;
        int audioChannels = frame.audioChannels;
        int sampleSizeInBit = 16;
        // Amount of audio data per second, in bytes
        int oneSecondBytes = sampleRate * sampleSizeInBit / 8 * audioChannels;

        // Audio-only frames carry no image; skip them
        if (frame.image == null) {
            continue;
        }

        // Convert the video frame to a JPEG and broadcast it over WebSocket
        BufferedImage image = converter.convert(frame);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(image, "jpg", baos);
        webSocketService.sendBroadcast(baos.toByteArray());

        // Throttle: wait one second before grabbing the next frame
        Thread.sleep(1000);
    }
    grabber.stop();
} catch (Exception e) {
    log.error("failed to relay stream", e);
}
A one-second delay is added in the loop so that the server does not crash from pulling the stream too aggressively.
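The webSocketService.sendBroadcast(...) call above is assumed to be a small helper that pushes a binary message to every connected browser. A minimal sketch of such a service with the javax.websocket API could look like this; the /live endpoint path and the class name are illustrative, and in a Spring Boot project the endpoint still has to be registered (for example with a ServerEndpointExporter bean):

import javax.websocket.OnClose;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.concurrent.CopyOnWriteArraySet;

// Illustrative broadcast service: every connected page receives each JPEG frame
@ServerEndpoint("/live")
public class WebSocketService {

    // Sessions of all currently connected browsers
    private static final CopyOnWriteArraySet<Session> SESSIONS = new CopyOnWriteArraySet<>();

    @OnOpen
    public void onOpen(Session session) {
        SESSIONS.add(session);
    }

    @OnClose
    public void onClose(Session session) {
        SESSIONS.remove(session);
    }

    // Broadcast one binary message (here: a JPEG-encoded video frame) to all clients
    public void sendBroadcast(byte[] data) {
        for (Session session : SESSIONS) {
            if (!session.isOpen()) {
                continue;
            }
            try {
                session.getBasicRemote().sendBinary(ByteBuffer.wrap(data));
            } catch (IOException e) {
                SESSIONS.remove(session);
            }
        }
    }
}

On the page, the received binary messages can then be turned into images (for example via a Blob URL assigned to an img element) to approximate live playback.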