We have covered quite a lot by now; of the basics, only textures remain. Textures are a very important part of DirectX. For simplicity, we will again use Tutorial 5 from the SDK as the example.
A texture is like a sheet of wallpaper pasted onto the surface of an object. If it is large enough, a single application covers the whole surface; alternatively, the texture can be tiled in a suitable way to produce the effect you want.
Let's look at the more important texture-related methods, starting with Device.SetTexture:
public void SetTexture(
    int stage,           // index of the texture blending stage, starting at 0
    BaseTexture texture  // the texture object to set
);
public void SetTextureStageState(
    int stage,                // index of the texture blending stage
    TextureStageStates state, // a member of the TextureStageStates enumeration
    int value                 // the value for that stage state
);
SetTextureStageState is the right tool for per-stage texture coordinates, color operations, alpha operations, and bump/environment mapping. Note, however, that these operations apply only to the DirectX 9 fixed-function multitexturing units; they cannot be combined with pixel shaders.
public void SetSamplerState(
    int stage,                // index of the sampler (texture blending stage)
    SamplerStageStates state, // a member of the SamplerStageStates enumeration
    int value                 // the value for that sampler state
);
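To see how these calls fit together, here is a minimal sketch; it assumes device is an already-initialized Device and texture is a loaded Texture, and the filtering and addressing choices are illustrative rather than taken from the tutorial. Managed DirectX also exposes the same states through the device.TextureState and device.SamplerState collections, which is the form the tutorial code below uses.

// A minimal sketch, assuming "device" and "texture" already exist; the
// filter and address values here are illustrative, not from the SDK sample.
device.SetTexture(0, texture); // bind the texture to stage 0

// Stage 0: modulate the texture color with the vertex diffuse color.
device.TextureState[0].ColorOperation = TextureOperation.Modulate;
device.TextureState[0].ColorArgument1 = TextureArgument.TextureColor;
device.TextureState[0].ColorArgument2 = TextureArgument.Diffuse;

// Sampler 0: bilinear filtering, and wrap coordinates that fall outside [0, 1].
device.SamplerState[0].MinFilter = TextureFilter.Linear;
device.SamplerState[0].MagFilter = TextureFilter.Linear;
device.SamplerState[0].AddressU = TextureAddress.Wrap;
device.SamplerState[0].AddressV = TextureAddress.Wrap;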
With that in hand, the code below is easy to follow. We need to create the vertices, and here there is a small change. The vertex types we used before did not involve textures, so we picked CustomVertex types without texture coordinates; now we use CustomVertex.PositionNormalTextured. As the name suggests, this type holds a normal, the position (x, y, z), and the texture coordinates Tu and Tv.
CustomVertex.PositionTextured would also work; it simply carries no normal information.
Next we need to fill in each vertex, but first a few words about texture coordinates. To address each texel in a texture, DirectX uses a generalized addressing scheme: a texture address is made up of coordinates in the range [0.0, 1.0], so we never have to care about the actual size of the texture. For example, the coordinates (0.0f, 0.0f), (1.0f, 0.0f), (1.0f, 1.0f), (0.0f, 1.0f) map the whole texture onto a rectangle, while (0.0f, 0.0f), (0.5f, 0.0f), (0.5f, 1.0f), (0.0f, 1.0f) map only the left half of the texture.
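As a quick illustration (this is not part of the tutorial code), here is a sketch that lays those coordinates onto a quad using CustomVertex.PositionTextured; drawn as a two-triangle strip, the four corners show the whole texture.

// A small sketch with assumed positions; only the Tu/Tv values matter here.
CustomVertex.PositionTextured[] quad = new CustomVertex.PositionTextured[4];
quad[0] = new CustomVertex.PositionTextured(-1.0f,  1.0f, 0.0f, 0.0f, 0.0f); // top-left     -> (tu, tv) = (0, 0)
quad[1] = new CustomVertex.PositionTextured( 1.0f,  1.0f, 0.0f, 1.0f, 0.0f); // top-right    -> (1, 0)
quad[2] = new CustomVertex.PositionTextured(-1.0f, -1.0f, 0.0f, 0.0f, 1.0f); // bottom-left  -> (0, 1)
quad[3] = new CustomVertex.PositionTextured( 1.0f, -1.0f, 0.0f, 1.0f, 1.0f); // bottom-right -> (1, 1)
// Changing the 1.0f Tu values on the right-hand vertices to 0.5f would map
// only the left half of the texture onto the same quad.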
We can load an image file as a texture with the TextureLoader.FromFile method.
The code below is simple and thoroughly commented, so I will not go through it line by line.
//-----------------------------------------------------------------------------
// File: Texture.cs
//
// Desc: Better than just lights and materials, 3D objects look much more
//       convincing when texture-mapped. Textures can be thought of as a sort
//       of wallpaper, that is shrinkwrapped to fit a texture. Textures are
//       typically loaded from image files, and D3DX provides a utility
//       function to do this for us. Like a vertex buffer, textures have
//       Lock() and Unlock() functions to access (read or write) the image
//       data. Textures have a width, height, miplevel, and pixel format. The
//       miplevel is for "mipmapped" textures, an advanced performance-
//       enhancing feature which uses lower resolutions of the texture for
//       objects in the distance where detail is less noticeable. The pixel
//       format determines how the colors are stored in a texel. The most
//       common formats are the 16-bit R5G6B5 format (5 bits of red, 6 bits of
//       green and 5 bits of blue) and the 32-bit A8R8G8B8 format (8 bits each
//       of alpha, red, green, and blue).
//
//       Textures are associated with geometry through texture coordinates.
//       Each vertex has one or more sets of texture coordinates, which are
//       named tu and tv and range from 0.0 to 1.0. Texture coordinates can be
//       supplied by the geometry, or can be automatically generated using
//       Direct3D texture coordinate generation (which is an advanced feature).
//
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------
using System;
using System.Drawing;
using System.Windows.Forms;
using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;
using Direct3D = Microsoft.DirectX.Direct3D;

namespace TextureTutorial
{
    public class Textures : Form
    {
        // Our global variables for this project
        Device device = null; // Our rendering device
        VertexBuffer vertexBuffer = null;
        Texture texture = null;
        PresentParameters presentParams = new PresentParameters();
        bool pause = false;

        public Textures()
        {
            // Set the initial size of our form
            this.ClientSize = new System.Drawing.Size(400, 300);
            // And its caption
            this.Text = "Direct3D Tutorial 5 - Textures";
        }
        public bool InitializeGraphics()
        {
            try
            {
                presentParams.Windowed = true; // We don't want to run fullscreen
                presentParams.SwapEffect = SwapEffect.Discard; // Discard the frames
                presentParams.EnableAutoDepthStencil = true; // Turn on a depth stencil
                presentParams.AutoDepthStencilFormat = DepthFormat.D16; // And the stencil format
                device = new Device(0, DeviceType.Hardware, this, CreateFlags.SoftwareVertexProcessing, presentParams); // Create a device
                device.DeviceReset += new System.EventHandler(this.OnResetDevice);
                this.OnCreateDevice(device, null);
                this.OnResetDevice(device, null);
                pause = false;
                return true;
            }
            catch (DirectXException)
            {
                // Catch any errors and return a failure
                return false;
            }
        }
        public void OnCreateDevice(object sender, EventArgs e)
        {
            Device dev = (Device)sender;
            // Now create the vertex buffer
            vertexBuffer = new VertexBuffer(typeof(CustomVertex.PositionNormalTextured), 100, dev, Usage.WriteOnly, CustomVertex.PositionNormalTextured.Format, Pool.Default);
            vertexBuffer.Created += new System.EventHandler(this.OnCreateVertexBuffer);
            this.OnCreateVertexBuffer(vertexBuffer, null);
        }
        public void OnResetDevice(object sender, EventArgs e)
        {
            Device dev = (Device)sender;
            // Turn off culling, so we see the front and back of the triangle
            dev.RenderState.CullMode = Cull.None;
            // Turn off D3D lighting
            dev.RenderState.Lighting = false;
            // Turn on the z-buffer
            dev.RenderState.ZBufferEnable = true;
            // Now create our texture
            texture = TextureLoader.FromFile(dev, Application.StartupPath + @"/../../banana.bmp");
        }
        public void OnCreateVertexBuffer(object sender, EventArgs e)
        {
            VertexBuffer vb = (VertexBuffer)sender;
            // Fill the vertex buffer (100 CustomVertex structs)
            CustomVertex.PositionNormalTextured[] verts = (CustomVertex.PositionNormalTextured[])vb.Lock(0, 0); // Lock the buffer (which will return our structs)
            for (int i = 0; i < 50; i++)
            {
                // Fill up our structs
                float theta = (float)(2 * Math.PI * i) / 49;
                verts[2 * i].Position = new Vector3((float)Math.Sin(theta), -1, (float)Math.Cos(theta));
                verts[2 * i].Normal = new Vector3((float)Math.Sin(theta), 0, (float)Math.Cos(theta));
                verts[2 * i].Tu = ((float)i) / (50 - 1);
                verts[2 * i].Tv = 1.0f;
                verts[2 * i + 1].Position = new Vector3((float)Math.Sin(theta), 1, (float)Math.Cos(theta));
                verts[2 * i + 1].Normal = new Vector3((float)Math.Sin(theta), 0, (float)Math.Cos(theta));
                verts[2 * i + 1].Tu = ((float)i) / (50 - 1);
                verts[2 * i + 1].Tv = 0.0f;
            }
            // Unlock (and copy) the data
            vb.Unlock();
        }
        private void SetupMatrices()
        {
            // For our world matrix, we will just rotate the object about the y-axis.
            device.Transform.World = Matrix.RotationAxis(new Vector3((float)Math.Cos(Environment.TickCount / 250.0f), 1, (float)Math.Sin(Environment.TickCount / 250.0f)), Environment.TickCount / 1000.0f);
            // Set up our view matrix. A view matrix can be defined given an eye point,
            // a point to look at, and a direction for which way is up. Here, we set the
            // eye five units back along the z-axis and up three units, look at the
            // origin, and define "up" to be in the y-direction.
            device.Transform.View = Matrix.LookAtLH(new Vector3(0.0f, 3.0f, -5.0f), new Vector3(0.0f, 0.0f, 0.0f), new Vector3(0.0f, 1.0f, 0.0f));
            // For the projection matrix, we set up a perspective transform (which
            // transforms geometry from 3D view space to 2D viewport space, with
            // a perspective divide making objects smaller in the distance). To build
            // a perspective transform, we need the field of view (1/4 pi is common),
            // the aspect ratio, and the near and far clipping planes (which define at
            // what distances geometry should no longer be rendered).
            device.Transform.Projection = Matrix.PerspectiveFovLH((float)Math.PI / 4.0f, 1.0f, 1.0f, 100.0f);
        }
        private void Render()
        {
            if (pause)
                return;

            // Clear the backbuffer to a blue color
            device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, System.Drawing.Color.Blue, 1.0f, 0);
            // Begin the scene
            device.BeginScene();
            // Set up the world, view, and projection matrices
            SetupMatrices();
            // Set up our texture. Using textures introduces the texture stage states,
            // which govern how textures get blended together (in the case of multiple
            // textures) and lighting information. In this case, we are modulating
            // (blending) our texture with the diffuse color of the vertices.
            device.SetTexture(0, texture);
            device.TextureState[0].ColorOperation = TextureOperation.Modulate;
            device.TextureState[0].ColorArgument1 = TextureArgument.TextureColor;
            device.TextureState[0].ColorArgument2 = TextureArgument.Diffuse;
            device.TextureState[0].AlphaOperation = TextureOperation.Disable;
            device.SetStreamSource(0, vertexBuffer, 0);
            device.VertexFormat = CustomVertex.PositionNormalTextured.Format;
            device.DrawPrimitives(PrimitiveType.TriangleStrip, 0, (4 * 25) - 2);
            // End the scene
            device.EndScene();
            // Update the screen
            device.Present();
        }
        protected override void OnPaint(System.Windows.Forms.PaintEventArgs e)
        {
            this.Render(); // Render on painting
        }

        protected override void OnKeyPress(System.Windows.Forms.KeyPressEventArgs e)
        {
            if ((int)(byte)e.KeyChar == (int)System.Windows.Forms.Keys.Escape)
                this.Dispose(); // Esc was pressed
        }

        protected override void OnResize(System.EventArgs e)
        {
            pause = ((this.WindowState == FormWindowState.Minimized) || !this.Visible);
        }

        /// <summary>
        /// The main entry point for the application.
        /// </summary>
        static void Main()
        {
            using (Textures frm = new Textures())
            {
                if (!frm.InitializeGraphics()) // Initialize Direct3D
                {
                    MessageBox.Show("Could not initialize Direct3D. This tutorial will exit.");
                    return;
                }
                frm.Show();

                // While the form is still valid, render and process messages
                while (frm.Created)
                {
                    frm.Render();
                    Application.DoEvents();
                }
            }
        }
    }
}
There is also a simpler way to set up a texture. It is really much the same, just less to write: create the texture directly from a Bitmap,

tex = new Texture(device, new Bitmap(this.GetType(), "puck.bmp"), Usage.Dynamic, Pool.Default);

and then, when rendering, a single call

device.SetTexture(0, tex);

puts the texture onto the object. Once you need slightly more involved texture operations, though, this shortcut is no longer enough.
There is much more to textures than this: texture addressing modes, texture wrapping, texture filtering and anti-aliasing, alpha blending, multitexturing, and so on. What has been covered here is only a drop in the bucket, but those topics will all be introduced step by step later on.
by sssa2000