Getting Started with Video/Audio Data Processing: Color Space (Part 2) --- FFmpeg

2023-05-16

Contents

Overview

Workflow

Overall flow

Initialization

Initialization code

Conversion

Conversion code

Cleanup

Full code walkthrough

Source code links


Overview

This article gives a brief overview of color-space conversion with FFmpeg's libswscale. libswscale implements conversion between the various image pixel formats, for example between YUV and RGB. The following is a quick walkthrough of how to use it for color-space conversion.

Workflow

Overall flow

libswscale is straightforward to use; there are only three essential functions:
(1) sws_getContext(): initialize an SwsContext structure from the given parameters.
(2) sws_scale(): convert one frame of image data.
(3) sws_freeContext(): free the SwsContext structure.
sws_getContext() can also be replaced by the alternative interface sws_getCachedContext(), which reuses an existing context when the parameters have not changed.

Initialization

To initialize the SwsContext we use sws_getContext() here. Besides that function there is a second, more flexible approach that can configure additional parameters. It consists of the following calls:
1)  sws_alloc_context(): allocate memory for an SwsContext structure.
2)  av_opt_set_XXX(): set the fields of the SwsContext through av_opt_set_int(), av_opt_set(), and the rest of that family. Note that the definition of SwsContext is opaque, so its member variables cannot be assigned directly; they must be set through av_opt_set()-style APIs.
3)  sws_init_context(): initialize the SwsContext structure.
Compared with the first approach, this more involved one can configure parameters that sws_getContext() cannot. For example, it can choose whether the YUV pixel values follow the JPEG standard (Y, U, and V all range over 0-255) or the MPEG standard (Y ranges over 16-235; U and V range over 16-240).

Initialization code

m_imgConvertCtx = sws_getContext(cfg->srcWide, cfg->srcHigh, srcIter->second,
		cfg->dstWide, cfg->dstHigh, dstIter->second, SWS_BICUBIC, NULL, NULL, NULL);

Conversion

There is nothing special about the conversion step itself: just call sws_scale(). Note, however, that its plane-pointer and linesize arguments must be arranged according to the corresponding pixel format.

Conversion code

sws_scale(m_imgConvertCtx, m_srcPointers, m_srcLinesizes, 0, m_srcHigh, m_dstPointers, m_dstLinesizes);

Cleanup


	if (nullptr != m_imgConvertCtx)
	{
		sws_freeContext(m_imgConvertCtx);
	}

	m_imgConvertCtx = nullptr;

Full code walkthrough

The FFmpeg color-space conversion demo here is wrapped in a class that provides mutual conversion between NV12, NV21, YUV420P, YUV422P, RGB24, and RGBA. To support additional formats, just implement the corresponding pack/unpack functions.

Header: ColorConversionFFmpeg.h

/**
 * Color-space conversion with FFmpeg
 * YUV Transformation
 *
 * 梁启东 qidong.liang
 * 18088708700@163.com
 * https://blog.csdn.net/u011645307
 *
 *
 * This program implements YUV-to-YUV and YUV-to-RGB conversion with FFmpeg.
 * It provides mutual conversion between the following formats:
 * 	FFMPEG_AV_PIX_FMT_NOKNOW,
 *	FFMPEG_AV_PIX_FMT_NV12,
 *	FFMPEG_AV_PIX_FMT_NV21,
 *	FFMPEG_AV_PIX_FMT_YUV420P,
 *	FFMPEG_AV_PIX_FMT_YUV422P,
 *	FFMPEG_AV_PIX_FMT_RGB24,
 *	FFMPEG_AV_PIX_FMT_RGBA
 */
#ifndef COLOR_CONVERSION_FFMPEG_H
#define	COLOR_CONVERSION_FFMPEG_H

#ifdef _WIN32
//Windows
extern "C"
{
#include "libswscale/swscale.h"
#include "libavutil/opt.h"
#include "libavutil/imgutils.h"
};
#else
//Linux...
#ifdef __cplusplus
extern "C"
{
#endif
#include <libavutil/opt.h>
#include <libswscale/swscale.h>
#include <libavutil/imgutils.h>
#ifdef __cplusplus
};
#endif
#endif

#include <map>
#include <functional>

#ifndef FFMPEG_PIX_FORMAT
#define	FFMPEG_PIX_FORMAT
typedef enum FFmpegAVPixelFormat
{
	FFMPEG_AV_PIX_FMT_NOKNOW,
	FFMPEG_AV_PIX_FMT_NV12,
	FFMPEG_AV_PIX_FMT_NV21,
	FFMPEG_AV_PIX_FMT_YUV420P,
	FFMPEG_AV_PIX_FMT_YUV422P,
	FFMPEG_AV_PIX_FMT_RGB24,
	FFMPEG_AV_PIX_FMT_RGBA

}FFmpegAVPixelFormat;

#endif//FFMPEG_PIX_FORMAT
#ifndef FFMPEG_SCALE_CONFIG
#define	FFMPEG_SCALE_CONFIG
typedef struct FFmpegSwscaleConfig
{
	unsigned int srcWide;
	unsigned int srcHigh;
	FFmpegAVPixelFormat srcFormat;
	unsigned int dstWide;
	unsigned int dstHigh;
	FFmpegAVPixelFormat dstFormat;

	FFmpegSwscaleConfig()
	{
		srcWide = 0;
		srcHigh = 0;
		srcFormat = FFMPEG_AV_PIX_FMT_NOKNOW;
		dstWide = 0;
		dstHigh = 0;
		dstFormat = FFMPEG_AV_PIX_FMT_NOKNOW;
	}
}FFmpegSwscaleConfig;
#endif // !FFMPEG_SCALE_CONFIG

class ColorConversionFFmpeg
{
public:
	ColorConversionFFmpeg();
	~ColorConversionFFmpeg();

	long Init(FFmpegSwscaleConfig* cfg);
	long Conversion(const char* inputBuff, char* outputBuff);
	long UnInit();

private:
	long BuffToAVPixFmtYUV420P(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtRGBA(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtRGB24(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtNV12(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtNV21(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtYUV422P(char* inputBuff, unsigned char** pixBuff);

	long AVPixFmtYUV420PToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtNV12ToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtNV21ToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtYUV422PToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtRGB24ToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtRGBAToBuff(unsigned char** pixBuff, char* outputBuff);

private:
	SwsContext* m_imgConvertCtx;
	uint8_t* m_srcPointers[4]{ nullptr,nullptr,nullptr,nullptr };
	int m_srcLinesizes[4]{0,0,0,0};
	uint8_t* m_dstPointers[4]{ nullptr,nullptr,nullptr,nullptr };
	int m_dstLinesizes[4]{ 0,0,0,0 };

	int m_srcHigh;
	int m_srcWide;
	std::function < long(char* inputBuff, unsigned char** pixBuff) > m_infun;
	std::function < long(unsigned char** pixBuff, char* outputBuff) > m_outfun;
	std::map<FFmpegAVPixelFormat, AVPixelFormat>			m_PixelFormatMap;
	std::map<FFmpegAVPixelFormat,
		std::function < long(
			char* inputBuff,
			unsigned char** pixBuff) >>					    m_srcFormatFunMap;
	std::map<FFmpegAVPixelFormat,
		std::function < long(
			unsigned char** pixBuff,
			char* outputBuff) >>						    m_dstFormatFunMap;
};
#endif//COLOR_CONVERSION_FFMPEG_H

Source: ColorConversionFFmpeg.cpp

#include "ColorConversionFFmpeg.h"

ColorConversionFFmpeg::ColorConversionFFmpeg()
	: m_imgConvertCtx(nullptr)
	, m_infun(nullptr)
	, m_outfun(nullptr)
	, m_srcHigh(0)
	, m_srcWide(0)
{

	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_NV12, AV_PIX_FMT_NV12));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_NV21, AV_PIX_FMT_NV21));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_YUV420P, AV_PIX_FMT_YUV420P));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_YUV422P, AV_PIX_FMT_YUV422P));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_RGB24, AV_PIX_FMT_RGB24));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_RGBA, AV_PIX_FMT_RGBA));	// was AV_PIX_FMT_BGRA, which swaps the R and B channels

	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_NV12] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtNV12,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_NV21] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtNV21,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_YUV420P] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtYUV420P,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_YUV422P] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtYUV422P,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_RGB24] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtRGB24,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_RGBA] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtRGBA,
		this,
		std::placeholders::_1,
		std::placeholders::_2);


	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_NV12] = std::bind(&ColorConversionFFmpeg::AVPixFmtNV12ToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_NV21] = std::bind(&ColorConversionFFmpeg::AVPixFmtNV21ToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_YUV420P] = std::bind(&ColorConversionFFmpeg::AVPixFmtYUV420PToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_YUV422P] = std::bind(&ColorConversionFFmpeg::AVPixFmtYUV422PToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_RGB24] = std::bind(&ColorConversionFFmpeg::AVPixFmtRGB24ToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_RGBA] = std::bind(&ColorConversionFFmpeg::AVPixFmtRGBAToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);

}

ColorConversionFFmpeg::~ColorConversionFFmpeg()
{
	m_PixelFormatMap.clear();
	m_srcFormatFunMap.clear();
	m_dstFormatFunMap.clear();

}

long ColorConversionFFmpeg::Init(FFmpegSwscaleConfig* cfg)
{
	if (nullptr == cfg)
	{
		return -1;
	}
	auto srcIter = m_PixelFormatMap.find(cfg->srcFormat);
	auto dstIter = m_PixelFormatMap.find(cfg->dstFormat);
	if (srcIter == m_PixelFormatMap.end() ||
		dstIter == m_PixelFormatMap.end())
	{
		return -2;
	}
	auto srcFormatFunIter = m_srcFormatFunMap.find(cfg->srcFormat);
	auto dstFormatFunIter = m_dstFormatFunMap.find(cfg->dstFormat);
	if (dstFormatFunIter == m_dstFormatFunMap.end() ||
		srcFormatFunIter == m_srcFormatFunMap.end())
	{
		return -3;
	}

	m_infun = srcFormatFunIter->second;
	m_outfun = dstFormatFunIter->second;

	int nSrctBuffLen = 0, nDstBuffLen = 0;

	nSrctBuffLen = av_image_alloc(m_srcPointers, m_srcLinesizes, cfg->srcWide, cfg->srcHigh, srcIter->second, 1);
	if (nSrctBuffLen <= 0)
	{
		return -4;
	}
	nDstBuffLen = av_image_alloc(m_dstPointers, m_dstLinesizes, cfg->dstWide, cfg->dstHigh, dstIter->second, 1);
	if (nDstBuffLen <= 0 )
	{
		av_freep(&m_srcPointers[0]);
		return -5;
	}

	m_imgConvertCtx = sws_getContext(cfg->srcWide, cfg->srcHigh, srcIter->second,
		cfg->dstWide, cfg->dstHigh, dstIter->second, SWS_BICUBIC, NULL, NULL, NULL);
	
	if (nullptr == m_imgConvertCtx)
	{
		av_freep(&m_srcPointers[0]);
		av_freep(&m_dstPointers[0]);
		return -6;
	}

	m_srcHigh = cfg->srcHigh;
	m_srcWide = cfg->srcWide;

	return 0;
}

long ColorConversionFFmpeg::Conversion(const char* inputBuff, char* outputBuff)
{
	if (nullptr == m_infun ||
		nullptr == m_outfun ||
		nullptr == m_dstPointers[0] ||
		nullptr == m_srcPointers[0] ||
		nullptr == m_imgConvertCtx)
	{
		return -1;
	}
	
	m_infun(const_cast<char*>(inputBuff), m_srcPointers);

	sws_scale(m_imgConvertCtx, m_srcPointers, m_srcLinesizes, 0, m_srcHigh, m_dstPointers, m_dstLinesizes);

	m_outfun(m_dstPointers, outputBuff);
	return 0;
}

long ColorConversionFFmpeg::UnInit()
{
	if (nullptr != m_srcPointers[0])
	{
		av_freep(&m_srcPointers[0]);	// also resets m_srcPointers[0] to nullptr
	}
	if (nullptr != m_dstPointers[0])
	{
		av_freep(&m_dstPointers[0]);	// also resets m_dstPointers[0] to nullptr
	}

	if (nullptr != m_imgConvertCtx)
	{
		sws_freeContext(m_imgConvertCtx);
	}

	m_imgConvertCtx = nullptr;
	m_outfun = nullptr;
	m_infun = nullptr;

	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtYUV420P(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, static_cast<size_t>(m_srcWide * m_srcHigh));											//Y
	memcpy(pixBuff[1], inputBuff + m_srcWide * m_srcHigh, m_srcWide * m_srcHigh / 4);				//U
	memcpy(pixBuff[2], inputBuff + m_srcWide * m_srcHigh * 5 / 4, m_srcWide * m_srcHigh / 4);		//V
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtRGBA(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcWide * m_srcHigh*4);
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtRGB24(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcWide * m_srcHigh * 3);
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtNV12(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcHigh*m_srcWide);                    //Y
	memcpy(pixBuff[1], inputBuff + m_srcHigh * m_srcWide, m_srcHigh*m_srcWide / 2);      //Uv
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtNV21(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcHigh * m_srcWide);                    //Y
	memcpy(pixBuff[1], inputBuff + m_srcHigh * m_srcWide, m_srcHigh * m_srcWide / 2);      //Uv
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtYUV422P(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcWide * m_srcHigh);											//Y
	memcpy(pixBuff[1], inputBuff + m_srcWide * m_srcHigh, m_srcWide * m_srcHigh / 2);				//U
	memcpy(pixBuff[2], inputBuff + m_srcWide * m_srcHigh * 3 / 2, m_srcWide * m_srcHigh / 2);		//V
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtYUV420PToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_srcWide * m_srcHigh);											//Y
	memcpy(outputBuff + m_srcWide * m_srcHigh, pixBuff[1], m_srcWide * m_srcHigh / 4);				//U
	memcpy(outputBuff + m_srcWide * m_srcHigh * 5 / 4, pixBuff[2], m_srcWide * m_srcHigh / 4);		//V
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtNV12ToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy( outputBuff, pixBuff[0], m_srcHigh * m_srcWide);                    //Y
	memcpy( outputBuff + m_srcHigh * m_srcWide, pixBuff[1], m_srcHigh * m_srcWide / 2);      //Uv
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtNV21ToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_srcHigh * m_srcWide);                    //Y
	memcpy(outputBuff + m_srcHigh * m_srcWide, pixBuff[1], m_srcHigh * m_srcWide / 2);      //Uv
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtYUV422PToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_srcWide * m_srcHigh);											//Y
	memcpy(outputBuff + m_srcWide * m_srcHigh, pixBuff[1], m_srcWide * m_srcHigh / 2);				//U
	memcpy(outputBuff + m_srcWide * m_srcHigh * 3 / 2, pixBuff[2], m_srcWide * m_srcHigh / 2);		//V
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtRGB24ToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_srcWide * m_srcHigh * 3);
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtRGBAToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_srcWide * m_srcHigh * 4);
	return 0;
}

Test program: main.cpp

/**
 * Test program for the FFmpeg color-space conversion class
 * YUV Transformation
 *
 * 梁启东 qidong.liang
 * 18088708700@163.com
 * https://blog.csdn.net/u011645307
 */

#include <iostream>
#include "ColorConversionFFmpeg.h"


#define NV12_To_I420	0
#define I420_To_NV12	0
#define NV21_To_I420	0
#define I420_To_NV21	0
#define I420_To_RGB32	0
#define RGB32_To_I420	0
#define I420_To_RGB24	0
#define RGB24_To_I420	0
#define NV12_To_YUV422P	0
#define YUV422P_To_NV12	1
int main()
{
	FILE* file_in = nullptr;
	FILE* file_out = nullptr;
	char* input_name = nullptr;
	char* output_name = nullptr;

	int w = 0, h = 0;
	float flotScale = 0;
	int out_w = 0, out_h = 0;
	float out_flotScale = 0;

	FFmpegSwscaleConfig cfg;
	ColorConversionFFmpeg obj;

#if NV12_To_YUV422P
	input_name = const_cast<char*>("../in/nv21_480x272.yuv");
	output_name = const_cast<char*>("../out/yuvv422p_480x272.yuv");

	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_NV12;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV422P;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 2;

#endif

#if YUV422P_To_NV12

	input_name = const_cast<char*>("../in/YV16(422)_480x272.yuv");
	output_name = const_cast<char*>("../out/nv21_480x272.yuv");
	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV422P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_NV12;

	w = 480;
	h = 272;
	flotScale = 2;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if NV21_To_I420
	input_name = const_cast<char*>("../in/nv21_480x272.yuv");
	output_name = const_cast<char*>("../out/I420_480x272.yuv");

	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_NV21;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV420P;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if I420_To_NV21

	input_name = const_cast<char*>("../in/I420_480x272.yuv");
	output_name = const_cast<char*>("../out/nv21_480x272.yuv");
	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_NV21;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if NV12_To_I420
	input_name = const_cast<char*>("../in/nv12_480x272.yuv");
	output_name = const_cast<char*>("../out/I420_480x272.yuv");

	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_NV12;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV420P;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if I420_To_NV12

	input_name = const_cast<char*>("../in/I420_480x272.yuv");
	output_name = const_cast<char*>("../out/nv12_480x272.yuv");
	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_NV12;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if I420_To_RGB24
	input_name = const_cast<char*>("../in/I420_480x272.yuv");
	output_name = const_cast<char*>("../out/rgb_480x272.rgb");

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 3;

	cfg.srcWide = w;
	cfg.dstWide = out_w;
	cfg.dstHigh = out_h;
	cfg.srcHigh = h;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_RGB24;


#endif

#if RGB24_To_I420
	input_name = const_cast<char*>("../in/rgb_480x272.rgb");
	output_name = const_cast<char*>("../out/I420_480x272.yuv");

	w = 480;
	h = 272;
	flotScale = 3;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

	cfg.srcWide = w;
	cfg.dstWide = out_w;
	cfg.dstHigh = out_h;
	cfg.srcHigh = h;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_RGB24;

#endif

#if I420_To_RGB32
	input_name = const_cast<char*>("../in/I420_480x272.yuv");
	output_name = const_cast<char*>("../out/rgba_480x272.rgb");

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 4;

	cfg.srcWide = w;
	cfg.dstWide = out_w;
	cfg.dstHigh = out_h;
	cfg.srcHigh = h;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_RGBA;

#endif

#if RGB32_To_I420
	input_name = const_cast<char*>("../in/rgba_480x272.rgb");
	output_name = const_cast<char*>("../out/I420_480x272.yuv");

	w = 480;
	h = 272;
	flotScale = 4;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

	cfg.srcWide = w;
	cfg.dstWide = out_w;
	cfg.dstHigh = out_h;
	cfg.srcHigh = h;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_RGBA;

#endif

	int in_buff_len = w * h * flotScale;
	int out_buff_len = out_w * out_h * out_flotScale;
	char* inbuff = new char[in_buff_len];
	char* outbuff = new char[out_buff_len];
	fopen_s(&file_in, input_name, "rb");
	fopen_s(&file_out, output_name, "wb");
	if (nullptr == file_in || nullptr == file_out)
	{
		printf("failed to open input/output file\n");
		if (nullptr != file_in) fclose(file_in);
		if (nullptr != file_out) fclose(file_out);
		delete[] inbuff;
		delete[] outbuff;
		return -1;
	}


	int ret = obj.Init(&cfg);
	if (0 != ret)
	{
		printf("ColorConversionFFmpeg::Init ret:%d\n", ret);
		fclose(file_in);
		fclose(file_out);
		delete[] inbuff;
		delete[] outbuff;
		return -1;
	}
	while (true)
	{
		if (fread(inbuff, 1, in_buff_len, file_in) != in_buff_len)
		{
			break;
		}

		ret = obj.Conversion(inbuff, outbuff);
		if (0 != ret)
		{
			printf("ColorConversionFFmpeg::Conversion ret:%d\n", ret);
			continue;
		}
		fwrite(outbuff, 1, out_buff_len, file_out);
	}
	ret = obj.UnInit();
	if (0 != ret)
	{
		printf("ColorConversionFFmpeg::UnInit ret:%d\n", ret);
	}
	fclose(file_in);
	fclose(file_out);
	delete[] inbuff;
	delete[] outbuff;

	return 0;
}

Source code links

CSDN: https://download.csdn.net/download/u011645307/21739481?spm=1001.2014.3001.5501

GitHub: https://github.com/liangqidong/ColorConversion.git
