3D Shader: Sphere Environment Mapping

Date: 2022-09-10 18:36:11

It took 135 days, but the second shader is finally done; embarrassing, I know. In my defense, I spent these months studying the math behind basic graphics and reading Real-Time Rendering, and after a few books I've concluded that only hands-on practice really makes the material stick. Enough chatter: this shader took me three days (three evenings, to be precise). As per international convention, screenshots first:

 

[Screenshot: teapot with sphere environment mapping]

 

 

Another angle:

[Screenshot: teapot, second angle]

And one more:

 

[Screenshot: teapot, third angle]

 

The texture used is the spheremap.png that ships with OGRE:

 

[Image: spheremap.png from OGRE]

 

 

 

Below is the same effect produced in OGRE; its sample additionally blends in a rust texture on top, which I didn't bother to add:

 

[Screenshot: OGRE's sphere-mapping sample]

 

 

Another sphere-map texture can also be found online:

[Image: alternative sphere-map texture]

 

Let's show that one off too (forgive me; I finally wrote a shader of my own, and one spot took me a whole hour to debug, so allow me a few extra screenshots!)

From various angles:

[Screenshots: teapot with the alternative texture, from various angles]

 

OK, that's it for screenshots; now let's go over the theory.

 

Sphere mapping and cube mapping are both common environment-mapping techniques; sphere mapping is the earlier of the two.

Here I'll focus on how sphere mapping works.

First, its GLSL code:

 

vec2 SphereMap(in vec3 ecPosition3, in vec3 normal)
{
    float m;
    vec3 r, u;
    u = normalize(ecPosition3);
    r = reflect(u, normal);
    m = 2.0 * sqrt(r.x * r.x + r.y * r.y + (r.z + 1.0) * (r.z + 1.0));
    return vec2(r.x / m + 0.5, r.y / m + 0.5);
}


We'll use this code as the basis for explaining the principle.

Sphere mapping rests on one observation: place a perfectly reflective sphere in the scene and photograph it from infinitely far away, and you get a panoramic image of the environment.

For example:

[Image: example sphere-map panorama]

Of course, "panoramic" is an overstatement: the part of the scene directly behind the sphere (as seen from the camera) never shows up. In practice, though, the results look convincing.

Typically this texture can be produced by rendering the scene facing the +z axis with an orthographic projection, simulating a camera at infinity, though other methods work too.

 

Now that we have the panorama texture, how do we apply it to another object?

As an example, say we need to map a point A on the teapot. First compute the reflection vector r at A for the eye (at point E). How do we compute r?

Quite simply: r = 2 * (v · n) * n - v

where v is the unit direction vector from vertex A to E: v = normalize(E - A);

and n is the unit normal vector at A. The r obtained this way is also a unit vector.

 

Now we have the eye's reflection vector at point A on the teapot; how do we find the corresponding point in the texture?

 

Simple: we just need to find the point B in the texture whose eye-reflection vector matches the eye-reflection vector at A.

Note two things: the eye positions for viewing A and for viewing B are different, and while A's unit normal is known, B's is not.

 

So we assume that the eye's reflection vector at B equals the one at A, i.e. it is also r, and from that we solve for B's position on the sphere.

 

The single most important fact to know is this: the texture was photographed from the (0,0,1) direction, which is the eye direction at B (note that in DX it is (0,0,-1)). So adding the known reflection vector to (0,0,1) yields the corresponding normal: (r.x, r.y, r.z+1) is the direction of B's normal. Normalizing it as (r.x, r.y, r.z+1) / sqrt(r.x*r.x + r.y*r.y + (r.z+1.0)*(r.z+1.0)) gives B's unit normal n'.

Since B lies on the sphere, its unit normal can equally be read as B's position on the unit sphere. To map the 3D sphere (only the front half, of course) onto the 2D unit disc (a disc, mind you, not a square), we simply take (n'.x, n'.y) as the 2D texture coordinates.

 

The 2D coordinates obtained this way lie in [-1,1], while texture coordinates range over [0,1], so we shrink the range by half and shift it by 0.5; that is exactly what dividing by m = 2*sqrt(r.x*r.x + r.y*r.y + (r.z+1.0)*(r.z+1.0)) and adding 0.5 does in the code. OK, that's the entire pipeline!

 

That's the theory; on to the code! I've flagged the tricky details in the comments:

/*------------------------------------------------------------
SphereEnvMapping.cpp -- achieve sphere environment mapping
(c) Seamanj.2013/7/23
------------------------------------------------------------*/
//phase1 : add teapot
//phase2 : add camera
//phase3 : add sphere environment mapping shader
#include "DXUT.h"
#include "resource.h"

#define phase1 1
#define phase2 1
#define phase3 1
#if phase1
// Global variables
ID3DXMesh* pTeapotMesh = 0;
#endif
#if phase2
#include "DXUTcamera.h"
CModelViewerCamera g_Camera;
#endif
#if phase3
#include "SDKmisc.h"
ID3DXEffect* g_pEffect = NULL; // D3DX effect interface
IDirect3DTexture9* g_pSphereEnvTex = 0;
D3DXHANDLE g_hTech = 0;
D3DXHANDLE g_hWorldViewProj = NULL; // Handle for world+view+proj matrix in effect
D3DXHANDLE g_hWorldView = NULL;
D3DXHANDLE g_hWorldViewInv = NULL;
D3DXHANDLE g_hSphereEnvTex = NULL;
#endif
//--------------------------------------------------------------------------------------
// Rejects any D3D9 devices that aren't acceptable to the app by returning false
//--------------------------------------------------------------------------------------
bool CALLBACK IsD3D9DeviceAcceptable( D3DCAPS9* pCaps, D3DFORMAT AdapterFormat, D3DFORMAT BackBufferFormat,
bool bWindowed, void* pUserContext )
{
// Typically want to skip back buffer formats that don't support alpha blending
IDirect3D9* pD3D = DXUTGetD3D9Object();
if( FAILED( pD3D->CheckDeviceFormat( pCaps->AdapterOrdinal, pCaps->DeviceType,
AdapterFormat, D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
D3DRTYPE_TEXTURE, BackBufferFormat ) ) )
return false;

return true;
}


//--------------------------------------------------------------------------------------
// Before a device is created, modify the device settings as needed
//--------------------------------------------------------------------------------------
bool CALLBACK ModifyDeviceSettings( DXUTDeviceSettings* pDeviceSettings, void* pUserContext )
{
#if phase2
pDeviceSettings->d3d9.pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;
#endif
return true;
}


//--------------------------------------------------------------------------------------
// Create any D3D9 resources that will live through a device reset (D3DPOOL_MANAGED)
// and aren't tied to the back buffer size
//--------------------------------------------------------------------------------------
HRESULT CALLBACK OnD3D9CreateDevice( IDirect3DDevice9* pd3dDevice, const D3DSURFACE_DESC* pBackBufferSurfaceDesc,
void* pUserContext )
{
#if phase1
D3DXCreateTeapot( pd3dDevice, &pTeapotMesh, 0);
#endif
#if phase2
// Setup the camera's view parameters
D3DXVECTOR3 vecEye( 0.0f, 0.0f, -5.0f );
D3DXVECTOR3 vecAt ( 0.0f, 0.0f, 0.0f );
g_Camera.SetViewParams( &vecEye, &vecAt );
FLOAT fObjectRadius=1;
// default, minimum, and maximum camera radius
g_Camera.SetRadius( fObjectRadius * 3.0f, fObjectRadius * 0.5f, fObjectRadius * 10.0f );
g_Camera.SetEnablePositionMovement( true );
#endif
#if phase3
HRESULT hr;
// Create vertex shader
WCHAR str[MAX_PATH];
// Read the D3DX effect file
V_RETURN( DXUTFindDXSDKMediaFileCch( str, MAX_PATH, L"SphereEnvMapping.fx" ) );
// Create the effect
LPD3DXBUFFER pErrorBuff;
V_RETURN( D3DXCreateEffectFromFile(
pd3dDevice,// associated device
str,// effect filename
NULL,// no preprocessor definitions
NULL,// no ID3DXInclude interface
D3DXSHADER_DEBUG,// compile flags
NULL,// don't share parameters
&g_pEffect,// return effect
&pErrorBuff// return error messages
) );
//pErrorBuff
// Get handle
g_hTech = g_pEffect->GetTechniqueByName("myTechnique");
g_hWorldViewProj = g_pEffect->GetParameterByName(0, "g_mWorldViewProj");
g_hWorldView = g_pEffect->GetParameterByName(0, "g_mWorldView");
g_hWorldViewInv = g_pEffect->GetParameterByName(0, "g_mWorldViewInv");
g_hSphereEnvTex = g_pEffect->GetParameterByName(0, "g_txSphereEnvMap");

// Set texture:
D3DXCreateTextureFromFile(pd3dDevice, L"spheremap.bmp", &g_pSphereEnvTex);
#endif
return S_OK;
}


//--------------------------------------------------------------------------------------
// Create any D3D9 resources that won't live through a device reset (D3DPOOL_DEFAULT)
// or that are tied to the back buffer size
//--------------------------------------------------------------------------------------
HRESULT CALLBACK OnD3D9ResetDevice( IDirect3DDevice9* pd3dDevice, const D3DSURFACE_DESC* pBackBufferSurfaceDesc,
void* pUserContext )
{
#if phase3
HRESULT hr;
if( g_pEffect )
V_RETURN( g_pEffect->OnResetDevice() );
#endif
#if phase2
pd3dDevice->SetRenderState( D3DRS_CULLMODE, D3DCULL_NONE );
// disable lighting (it is enabled by default)
pd3dDevice->SetRenderState( D3DRS_LIGHTING, FALSE );
//Setup the camera's projection parameters
float fAspectRatio = pBackBufferSurfaceDesc->Width / ( FLOAT )pBackBufferSurfaceDesc->Height;

g_Camera.SetProjParams( D3DX_PI / 2, fAspectRatio, 0.1f, 5000.0f );
g_Camera.SetWindow( pBackBufferSurfaceDesc->Width, pBackBufferSurfaceDesc->Height );
g_Camera.SetButtonMasks( MOUSE_LEFT_BUTTON, MOUSE_WHEEL, MOUSE_RIGHT_BUTTON );
#endif

return S_OK;
}


//--------------------------------------------------------------------------------------
// Handle updates to the scene. This is called regardless of which D3D API is used
//--------------------------------------------------------------------------------------
void CALLBACK OnFrameMove( double fTime, float fElapsedTime, void* pUserContext )
{
#if phase2
g_Camera.FrameMove( fElapsedTime );
#endif
}


//--------------------------------------------------------------------------------------
// Render the scene using the D3D9 device
//--------------------------------------------------------------------------------------
void CALLBACK OnD3D9FrameRender( IDirect3DDevice9* pd3dDevice, double fTime, float fElapsedTime, void* pUserContext )
{
HRESULT hr;

// Clear the render target and the zbuffer
V( pd3dDevice->Clear( 0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, D3DCOLOR_ARGB( 0, 45, 50, 170 ), 1.0f, 0 ) );
// Render the scene
if( SUCCEEDED( pd3dDevice->BeginScene() ) )
{
#if phase3
UINT iPass, cPasses;
D3DXMATRIXA16 mWorldViewProjection,mWorldView,mWorldViewInv;
V(g_pEffect->SetTechnique(g_hTech));
V( g_pEffect->Begin( &cPasses, 0 ) );
for( iPass = 0; iPass < cPasses; iPass++ )
{
V( g_pEffect->BeginPass( iPass ) );
//set WorldViewProject matrix
mWorldViewProjection = *g_Camera.GetWorldMatrix() * *g_Camera.GetViewMatrix() *
*g_Camera.GetProjMatrix();
V( g_pEffect->SetMatrix( g_hWorldViewProj, &mWorldViewProjection) );
//set WorldView matrix
mWorldView = *g_Camera.GetWorldMatrix() * *g_Camera.GetViewMatrix();
V( g_pEffect->SetMatrix( g_hWorldView, &mWorldView) );
//set WorldViewInv matrix
mWorldViewInv = *D3DXMatrixInverse(&mWorldViewInv, 0, &mWorldView);
V( g_pEffect->SetMatrix( g_hWorldViewInv, &mWorldViewInv) );
//set texture
V( g_pEffect->SetTexture( g_hSphereEnvTex, g_pSphereEnvTex) );


#if phase1
pTeapotMesh->DrawSubset( 0 );
#endif
V( g_pEffect->EndPass() );
}
V( g_pEffect->End() );
#endif
V( pd3dDevice->EndScene() );
}
}


//--------------------------------------------------------------------------------------
// Handle messages to the application
//--------------------------------------------------------------------------------------
LRESULT CALLBACK MsgProc( HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam,
bool* pbNoFurtherProcessing, void* pUserContext )
{
#if phase2
g_Camera.HandleMessages( hWnd, uMsg, wParam, lParam );
#endif
return 0;
}


//--------------------------------------------------------------------------------------
// Release D3D9 resources created in the OnD3D9ResetDevice callback
//--------------------------------------------------------------------------------------
void CALLBACK OnD3D9LostDevice( void* pUserContext )
{
#if phase3
if( g_pEffect )
g_pEffect->OnLostDevice();
#endif
}


//--------------------------------------------------------------------------------------
// Release D3D9 resources created in the OnD3D9CreateDevice callback
//--------------------------------------------------------------------------------------
void CALLBACK OnD3D9DestroyDevice( void* pUserContext )
{
#if phase1
SAFE_RELEASE(pTeapotMesh);
#endif
#if phase3
SAFE_RELEASE(g_pEffect);
SAFE_RELEASE(g_pSphereEnvTex);
#endif
}


//--------------------------------------------------------------------------------------
// Initialize everything and go into a render loop
//--------------------------------------------------------------------------------------
INT WINAPI wWinMain( HINSTANCE, HINSTANCE, LPWSTR, int )
{
// Enable run-time memory check for debug builds.
#if defined(DEBUG) | defined(_DEBUG)
_CrtSetDbgFlag( _CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF );
#endif

// Set the callback functions
DXUTSetCallbackD3D9DeviceAcceptable( IsD3D9DeviceAcceptable );
DXUTSetCallbackD3D9DeviceCreated( OnD3D9CreateDevice );
DXUTSetCallbackD3D9DeviceReset( OnD3D9ResetDevice );
DXUTSetCallbackD3D9FrameRender( OnD3D9FrameRender );
DXUTSetCallbackD3D9DeviceLost( OnD3D9LostDevice );
DXUTSetCallbackD3D9DeviceDestroyed( OnD3D9DestroyDevice );
DXUTSetCallbackDeviceChanging( ModifyDeviceSettings );
DXUTSetCallbackMsgProc( MsgProc );
DXUTSetCallbackFrameMove( OnFrameMove );

// TODO: Perform any application-level initialization here

// Initialize DXUT and create the desired Win32 window and Direct3D device for the application
DXUTInit( true, true ); // Parse the command line and show msgboxes
DXUTSetHotkeyHandling( true, true, true ); // handle the default hotkeys
DXUTSetCursorSettings( true, true ); // Show the cursor and clip it when in full screen
DXUTCreateWindow( L"SphereEnvMapping" );
DXUTCreateDevice( true, 640, 480 );

// Start the render loop
DXUTMainLoop();

// TODO: Perform any application-level cleanup here

return DXUTGetExitCode();
}



 

/*--------------------------------------------------------------------------
SphereEnvMapping.fx -- Sphere environment mapping shader
(c) Seamanj.2013/7/23
--------------------------------------------------------------------------*/

//--------------------------------------------------------------------------------------
// Global variables
//--------------------------------------------------------------------------------------
float4x4 g_mWorldViewProj;
float4x4 g_mWorldView;
float4x4 g_mWorldViewInv;
texture g_txSphereEnvMap;
//-----------------------------------------------------------------------------
// Sampler
//-----------------------------------------------------------------------------
sampler2D g_samShereEnvMap =
sampler_state
{
Texture = <g_txSphereEnvMap>;
MinFilter = Linear;
MagFilter = Linear;
MipFilter = Linear;
};

//--------------------------------------------------------------------------------------
// Vertex shader output structure
//--------------------------------------------------------------------------------------
struct VS_Output {
float4 position : POSITION;
float2 EnvTex : TEXCOORD0;
};


float2 SphereMap(float3 position, float3 normal)
{
float m;
float3 r,u;
u = normalize(position);
r = reflect(u, normal);
m = 2.0 * sqrt( r.x * r.x + r.y * r.y + (r.z - 1.0) * (r.z - 1.0) );// in DX this is a minus, because the texture is photographed from the (0,0,-1) direction
return float2(r.x / m + 0.5, r.y / m + 0.5);
}



//--------------------------------------------------------------------------------------
// Vertex shader
//--------------------------------------------------------------------------------------
VS_Output myVertexEntry(float4 position : POSITION,float3 normal : NORMAL)
{
VS_Output OUT;

OUT.position = mul ( position, g_mWorldViewProj);
position = mul( position, g_mWorldView );// SphereMap expects the eye-space (view-space) position, not the projected one
//normal = mul( normal, (float3x3)g_mWorldView);// this line is critical; it cost me an hour. Without transforming the normal, the environment texture rotates along with the teapot
// Strictly, the normal should be multiplied by the inverse transpose of the transform. Translation has no effect on
// normals, so only the upper-left 3x3 submatrix matters; and since this sample only rotates, that 3x3 block is
// orthogonal and the normal could be multiplied by it directly. To be rigorous, though, we still use the inverse
// transpose (rotations and reflections are orthogonal matrices; scales and shears are not), then renormalize.
// Annoyingly, HLSL cannot invert a matrix, so the inverse comes in from the CPU via g_mWorldViewInv.
normal = mul ( normal, transpose((float3x3)g_mWorldViewInv));
normal = normalize(normal);
OUT.EnvTex = SphereMap(position.xyz , normal);
return OUT;
}



//--------------------------------------------------------------------------------------
// Pixel shader
//--------------------------------------------------------------------------------------
float4 myPixelEntry(float2 Tex : TEXCOORD0) : COLOR
{

return tex2D(g_samShereEnvMap, Tex);
}


//--------------------------------------------------------------------------------------
// Renders scene to render target
//--------------------------------------------------------------------------------------
technique myTechnique
{
pass P0
{
VertexShader = compile vs_2_0 myVertexEntry();
PixelShader = compile ps_2_0 myPixelEntry();
}
}


 

 

 

The executable and the related source code can be downloaded here.

 

Sorry, it'll cost you 10 points (I'm genuinely running low). OK, off to bed; it's already 6:20....