Mapbox source code analysis (MVP matrix)

Because of how gl-matrix composes matrices (each call post-multiplies the current matrix), the order written in the JS code is the reverse of the order in which the transforms are applied. Read the calls from right to left / bottom to top: the model matrix is applied first, then the view matrix, then the projection matrix.

The final vertex transform is P * V * M * position, where position is a tile-local coordinate in the range 0 to EXTENT.
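A minimal sketch of this right-to-left reading rule, assuming Node.js with the gl-matrix package installed; the transform written last is the one applied to the vertex first:

const {mat4, vec4} = require('gl-matrix');

const m = mat4.create();
mat4.translate(m, m, [10, 0, 0]); // written first  -> applied to the vertex second
mat4.scale(m, m, [2, 2, 1]);      // written last   -> applied to the vertex first

const p = vec4.transformMat4([], [1, 1, 0, 1], m);
console.log(p); // [12, 2, 0, 1]: the point is scaled first, then translated

This is also why calculatePosMatrix in section 5.2 writes translate before scale, even though the scale to world units is the transform that reaches the vertex first.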

Source path: src/geo/transform.js

1. Coordinate range

Within a single tile, the raw vertex coordinates range from 0 to 8192 (EXTENT); the exact value depends on the specification used when the tiles were produced.

2. Model matrix: tile-local coordinates to world coordinates

2.1 Compute the true relative (local) coordinates within the current tile

 mat4.scale(posMatrix, posMatrix, [scale / EXTENT, scale / EXTENT, 1]);
 // where EXTENT = 8192, and
 // scale = this.worldSize / this.zoomScale(canonical.z)
 //       = this.tileSize * Math.pow(2, zoom) / Math.pow(2, Math.floor(zoom))
 // The result usually lies between 512 and 1024: as the fractional zoom moves away from
 // Math.floor(zoom) toward the next integer, the on-screen tile size grows from 512 to 1024.
 // Note that zoom can be fractional here.
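A quick worked example of the scale factor under assumed values (tileSize = 512, fractional zoom 3.3, tile at canonical.z = 3):

const tileSize = 512;
const zoom = 3.3;                         // fractional map zoom (assumed)
const canonicalZ = Math.floor(zoom);      // the tile's own zoom level
const worldSize = tileSize * Math.pow(2, zoom);
const scale = worldSize / Math.pow(2, canonicalZ);
console.log(scale);                       // ≈ 630.3 — between 512 and 1024, as noted above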

2.2 Compute the absolute (world) coordinates of each point in the current tile after translation, giving the model matrix (ModelMatrix)

mat4.translate(posMatrix, posMatrix, [unwrappedX * scale, canonical.y * scale, 0]);
// unwrappedX is the tile's x index (column), including the world-copy wrap offset
// canonical.y is the tile's y index (row)

The tile x/y coordinates computed here are already absolute coordinates in world space (the world width changes dynamically: this.worldSize = this.tileSize * Math.pow(2, zoom), where zoom may be fractional).
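A small sketch with assumed tile indices, showing that the two calls above take a tile-local corner (EXTENT, EXTENT) to absolute world pixels:

const {mat4, vec4} = require('gl-matrix');

const EXTENT = 8192;
const scale = 512;                  // assume an integer zoom, so scale === tileSize
const unwrappedX = 3, tileY = 2;    // assumed tile column / row

const posMatrix = mat4.identity(new Float64Array(16));
mat4.translate(posMatrix, posMatrix, [unwrappedX * scale, tileY * scale, 0]);
mat4.scale(posMatrix, posMatrix, [scale / EXTENT, scale / EXTENT, 1]);

const world = vec4.transformMat4([], [EXTENT, EXTENT, 0, 1], posMatrix);
console.log(world); // [2048, 1536, 0, 1] — the tile's bottom-right corner in world pixels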

3. View transform: into camera-space coordinates (ViewMatrix)

 mat4.scale(m, m, [1, 1, mercatorZfromAltitude(1, this.center.lat) * this.worldSize, 1]); // x and y are already in world units above; this stretches z into the same world units (strictly, this step could also be classified as part of the model transform)
 mat4.translate(m, m, [-x, -y, 0]); // move the origin from the world's top-left corner to the world coordinate of the screen center
 mat4.rotateZ(m, m, this.angle);
 mat4.rotateX(m, m, this._pitch);
 mat4.translate(m, m, [0, 0, -this.cameraToCenterDistance]); // push the scene back by the camera-to-center distance
 mat4.scale(m, m, [1, -1, 1]); // flip the y axis
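One number worth pinning down here is cameraToCenterDistance, which the translate above pushes the scene back by. A sketch with an assumed viewport height, using the same formula that appears in _calcMatrices below (0.5 / Math.tan(fov / 2) * height); the fov value is mapbox-gl-js's default:

const fov = 0.6435011087932844;   // ≈ 36.87°, the default field of view
const height = 600;               // assumed viewport height in px
const cameraToCenterDistance = 0.5 / Math.tan(fov / 2) * height;
console.log(cameraToCenterDistance); // 900 — i.e. 1.5 × height for the default fov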

Note:
The u_matrix received by a Mapbox custom layer (CustomLayerInterface) is the this.mercatorMatrix = mat4.scale([], m, [this.worldSize, this.worldSize, this.worldSize]); matrix from src/geo/transform.js.
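A minimal sketch of where that matrix shows up from the user's side: Mapbox passes it as the matrix argument of a custom layer's render hook (the layer id and the drawing code here are placeholders):

const customLayer = {
    id: 'highlight',        // hypothetical layer id
    type: 'custom',
    onAdd(map, gl) {
        // compile shaders, create buffers ...
    },
    render(gl, matrix) {
        // `matrix` is the mercatorMatrix: it maps mercator coordinates
        // ([0, 0] = north-west, [1, 1] = south-east) directly to GL clip space.
        // gl.uniformMatrix4fv(uMatrixLocation, false, matrix);
        // gl.drawArrays(...);
    }
};
// map.addLayer(customLayer);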

4. Projection transform

mat4.perspective(m, this._fov, this.width / this.height, nearZ, farZ);
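A sketch of the same call with assumed viewport values; at pitch 0, the farZ formula from _calcMatrices below reduces to cameraToCenterDistance * 1.01:

const {mat4} = require('gl-matrix');

const width = 800, height = 600;           // assumed viewport size
const fov = 0.6435011087932844;            // default field of view
const nearZ = height / 50;                 // 12
const farZ = 1.5 * height * 1.01;          // ≈ 909; only valid for pitch = 0
const proj = mat4.perspective(new Float64Array(16), fov, width / height, nearZ, farZ);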

5. Annotated matrix-calculation source in src/geo/transform.js

Version: transform source from v1.13.1

5.1 View-matrix and projection-matrix calculations (custom layers are a special case)

_calcMatrices() {
        if (!this.height) return;

        const halfFov = this._fov / 2;
        const offset = this.centerOffset;
        this.cameraToCenterDistance = 0.5 / Math.tan(halfFov) * this.height;

        // Find the distance from the center point [width/2 + offset.x, height/2 + offset.y] to the
        // center top point [width/2 + offset.x, 0] in Z units, using the law of sines.
        // 1 Z unit is equivalent to 1 horizontal px at the center of the map
        // (the distance between[width/2, height/2] and [width/2 + 1, height/2])
        const groundAngle = Math.PI / 2 + this._pitch;
        const fovAboveCenter = this._fov * (0.5 + offset.y / this.height);
        const topHalfSurfaceDistance = Math.sin(fovAboveCenter) * this.cameraToCenterDistance / Math.sin(clamp(Math.PI - groundAngle - fovAboveCenter, 0.01, Math.PI - 0.01));
        const point = this.point;
        const x = point.x, y = point.y;

        // Calculate z distance of the farthest fragment that should be rendered.
        const furthestDistance = Math.cos(Math.PI / 2 - this._pitch) * topHalfSurfaceDistance + this.cameraToCenterDistance;
        // Add a bit extra to avoid precision problems when a fragment's distance is exactly `furthestDistance`
        const farZ = furthestDistance * 1.01;

        // The larger the value of nearZ is
        // - the more depth precision is available for features (good)
        // - clipping starts appearing sooner when the camera is close to 3d features (bad)
        //
        // Smaller values worked well for mapbox-gl-js but deckgl was encountering precision issues
        // when rendering its layers using custom layers. This value was experimentally chosen and
        // seems to solve z-fighting issues in deckgl while not clipping buildings too close to the camera.
        const nearZ = this.height / 50;

        // matrix for conversion from location to GL coordinates (-1 .. 1)
        let m = new Float64Array(16);
        mat4.perspective(m, this._fov, this.width / this.height, nearZ, farZ);

        //Apply center of perspective offset
        m[8] = -offset.x * 2 / this.width;
        m[9] = offset.y * 2 / this.height;

        mat4.scale(m, m, [1, -1, 1]);
        mat4.translate(m, m, [0, 0, -this.cameraToCenterDistance]);
        mat4.rotateX(m, m, this._pitch);
        mat4.rotateZ(m, m, this.angle);
        mat4.translate(m, m, [-x, -y, 0]);

        // The mercatorMatrix can be used to transform points from mercator coordinates
        // ([0, 0] nw, [1, 1] se) to GL coordinates.
        this.mercatorMatrix = mat4.scale([], m, [this.worldSize, this.worldSize, this.worldSize]);

        // scale vertically to meters per pixel (inverse of ground resolution):
        mat4.scale(m, m, [1, 1, mercatorZfromAltitude(1, this.center.lat) * this.worldSize, 1]);

        this.projMatrix = m;
        this.invProjMatrix = mat4.invert([], this.projMatrix);

        // Make a second projection matrix that is aligned to a pixel grid for rendering raster tiles.
        // We're rounding the (floating point) x/y values to avoid rendering raster images to fractional
        // coordinates. Additionally, we adjust by half a pixel in either direction in case that viewport dimension
        // is an odd integer to preserve rendering to the pixel grid. We're rotating this shift based on the angle
        // of the transformation so that 0°, 90°, 180°, and 270° rasters are crisp, and adjust the shift so that
        // it is always <= 0.5 pixels.
        const xShift = (this.width % 2) / 2, yShift = (this.height % 2) / 2,
            angleCos = Math.cos(this.angle), angleSin = Math.sin(this.angle),
            dx = x - Math.round(x) + angleCos * xShift + angleSin * yShift,
            dy = y - Math.round(y) + angleCos * yShift + angleSin * xShift;
        const alignedM = new Float64Array(m);
        mat4.translate(alignedM, alignedM, [ dx > 0.5 ? dx - 1 : dx, dy > 0.5 ? dy - 1 : dy, 0 ]);
        this.alignedProjMatrix = alignedM;

        m = mat4.create();
        mat4.scale(m, m, [this.width / 2, -this.height / 2, 1]);
        mat4.translate(m, m, [1, -1, 0]);
        this.labelPlaneMatrix = m;

        m = mat4.create();
        mat4.scale(m, m, [1, -1, 1]);
        mat4.translate(m, m, [-1, -1, 0]);
        mat4.scale(m, m, [2 / this.width, 2 / this.height, 1]);
        this.glCoordMatrix = m;

        // matrix for conversion from location to screen coordinates
        this.pixelMatrix = mat4.multiply(new Float64Array(16), this.labelPlaneMatrix, this.projMatrix);

        // inverse matrix for conversion from screen coordinates to location
        m = mat4.invert(new Float64Array(16), this.pixelMatrix);
        if (!m) throw new Error("failed to invert matrix");
        this.pixelMatrixInverse = m;

        this._posMatrixCache = {};
        this._alignedPosMatrixCache = {};
    }
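A sketch of how one of these matrices is used: pixelMatrix takes a point from world pixel coordinates to screen pixels, roughly mirroring what transform.coordinatePoint() does (note the perspective divide by w):

const {vec4} = require('gl-matrix');

// worldX / worldY are mercator coordinates already multiplied by worldSize
function worldToScreen(pixelMatrix, worldX, worldY) {
    const p = vec4.transformMat4([], [worldX, worldY, 0, 1], pixelMatrix);
    return [p[0] / p[3], p[1] / p[3]];   // divide by w to finish the projection
}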

5.2 Model matrix

calculatePosMatrix(unwrappedTileID: UnwrappedTileID, aligned: boolean = false): Float32Array {
        const posMatrixKey = unwrappedTileID.key;
        const cache = aligned ? this._alignedPosMatrixCache : this._posMatrixCache;
        if (cache[posMatrixKey]) {
            return cache[posMatrixKey];
        }

        const canonical = unwrappedTileID.canonical;
        const scale = this.worldSize / this.zoomScale(canonical.z);
        const unwrappedX = canonical.x + Math.pow(2, canonical.z) * unwrappedTileID.wrap;

        const posMatrix = mat4.identity(new Float64Array(16));
        mat4.translate(posMatrix, posMatrix, [unwrappedX * scale, canonical.y * scale, 0]);
        mat4.scale(posMatrix, posMatrix, [scale / EXTENT, scale / EXTENT, 1]);
        mat4.multiply(posMatrix, aligned ? this.alignedProjMatrix : this.projMatrix, posMatrix);

        cache[posMatrixKey] = new Float32Array(posMatrix);
        return cache[posMatrixKey];
    }
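The resulting posMatrix is what the tile drawing programs receive as their u_matrix uniform. A sketch of the equivalent CPU-side math, taking a tile-local vertex (0..EXTENT) to normalized device coordinates (-1..1):

const {vec4} = require('gl-matrix');

function tileVertexToNDC(posMatrix, x, y) {          // x, y in [0, EXTENT]
    const clip = vec4.transformMat4([], [x, y, 0, 1], posMatrix);
    return [clip[0] / clip[3], clip[1] / clip[3]];   // NDC after the perspective divide
}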