1.
Mapping spatial change is a fundamental theme in geography. The analytical and numerical application of differential calculus to continuous geographic data produces first-derivative distributions that can be mapped to show gradient magnitude and gradient direction, and second-derivative measures that can be mapped to show the form (convexity, concavity) of the geographic surface. When these differential measures are obtained for spatially distributed temporal data, a velocity/acceleration change map can be constructed. Cartographic applications of the methodology presented in this paper include slope and curvature landform mapping, derivative trend-surface mapping of urban housing value gradients, and the velocity/acceleration mapping of mobile-home residency in the United States from 1950 to 1980.
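As a hedged illustration of the derivative measures described above (not the paper's own implementation), the sketch below approximates gradient magnitude, gradient direction, and a simple second-derivative convexity/concavity indicator on a hypothetical gridded surface using NumPy finite differences; the array z and the cell size are assumptions made for the example.

```python
import numpy as np

def derivative_maps(z, cell=1.0):
    """First- and second-derivative surfaces for a gridded field z.

    Returns gradient magnitude (slope), gradient direction (aspect, in
    radians), and the Laplacian as a crude convexity/concavity indicator.
    """
    dz_dy, dz_dx = np.gradient(z, cell)       # first derivatives (rows, cols)
    magnitude = np.hypot(dz_dx, dz_dy)        # gradient magnitude
    direction = np.arctan2(dz_dy, dz_dx)      # gradient direction

    d2z_dy2, _ = np.gradient(dz_dy, cell)     # second derivatives
    _, d2z_dx2 = np.gradient(dz_dx, cell)
    laplacian = d2z_dx2 + d2z_dy2             # > 0 concave up, < 0 convex up

    return magnitude, direction, laplacian

# Synthetic surface standing in for continuous geographic data.
y, x = np.mgrid[0:100, 0:100]
z = np.sin(x / 15.0) * np.cos(y / 20.0)
slope, aspect, curvature = derivative_maps(z, cell=1.0)
```

The same finite differences applied to a time series of surfaces would give the velocity and acceleration fields mentioned in the abstract.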
2.
The academic discipline of cartography is a twentieth-century phenomenon. From its incipient roots in landscape representation in geology and the mapping of socio-economic data in geography, it grew into a sub-discipline with graduate programs, research paradigms, and a scientific literature of its own. It came close to establishing a national center for cartography in the late 1960s. After rather sporadic activity before World War II, the period from 1946 to 1986 saw the building of major graduate programs at the universities of Wisconsin, Kansas, and Washington. Other programs were created, often with doctoral graduates from those three. At the end of the twentieth century, cartography underwent significant changes in relation to the emerging discipline of geographic information science. The future for academic cartography is less certain, as graduate programs adjust the balance among the many components of mapping science, including cartography, geovisualization, GI science, GIS, spatial analysis/statistics, and remote sensing.
3.
Over the last three decades analytical cartography has grown from Tobler's concept of "solving cartographic problems" into a broader and deeper scientific specialization that includes the development and expansion of analytical/mathematical spatial theory and model building. In many instances Tobler himself has led the way to these new insights and developments. Fundamental concepts begin with Tobler's cartographic transformations; Nyerges' deep and surface structure and data levels; Moellering's real and virtual maps; the sampling theorem; and concepts of spatial primitives and objects. This list can be expanded to include additional analytical concepts such as spatial frequencies, spatial surface neighborhood operators, information theory, fractals, Fourier theory, topological network theory, and analytical visualization, to name a few. This base of analytical theory can be employed to analyze and/or develop such things as spatial surfaces, terrain analysis, spatial data schemas, spatial data structures, spatial query languages, spatial overlay and partitioning, shape analysis, surface generalization, cartographic generalization, and analytical visualization. More analytical uses of theory, strategies of analysis, and implementations continue to be developed and multiply as the field grows and matures. A primary goal is to expand the mathematical/analytical theory of spatial data analysis, theory building, and analytical visualization as analytical cartography takes its place in the geographic information sciences. The research future for this area appears very bright indeed.
4.
We define Positional Accuracy Improvement as the problem of bringing together maps A and B of the same area, where B has higher planimetric accuracy. To do so, all objects in A might have to be slightly moved according to a mathematical transformation. Such a transformation might ideally be of a specific type, such as an analytical or conformal function. We have developed a theory to find a suitable analytical transformation even though the problem is not well defined, because the only data available are the displacement vectors at a limited number of homologous control points. A similar problem exists in fluid mechanics: estimating the complete velocity field given values at only a limited number of points. We borrowed some ideas from that field and introduced them into the positional accuracy improvement problem. We demonstrate that it is possible to numerically estimate an analytic function that resembles the given displacements at the control points. As a byproduct, an uncertainty estimate is produced, which might help to detect regions of different lineage. The theory has been applied to rural 1:50,000 cartography of Uruguay in an effort to reduce discrepancies against GNSS readings. After the analytic transformation, the RMSE diminished from 116 m to 48 m. Other problems with similar mathematical requirements include the transformation between geodetic control networks.
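The analytic-function estimation is the paper's own contribution and is not reproduced here; purely as a sketch of the general workflow (fit a smooth displacement field to a few homologous control points, apply it to every vertex of map A, and derive a rough uncertainty figure), the example below substitutes a thin-plate-spline interpolator from SciPy, with all coordinates and displacements being hypothetical.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical homologous control points: positions in map A and the observed
# displacement (map B minus map A) at each of them, in metres.
ctrl_xy = np.array([[1000., 2000.], [5000., 2500.], [3000., 7000.],
                    [8000., 8000.], [7000., 3000.]])
ctrl_disp = np.array([[60., -40.], [45., -35.], [70., -20.],
                      [55., -50.], [50., -42.]])

# Smooth displacement field fitted to the control points (thin-plate spline),
# a stand-in for the analytic transformation discussed in the paper.
field = RBFInterpolator(ctrl_xy, ctrl_disp, kernel='thin_plate_spline')

# Apply the estimated correction to every vertex of map A.
vertices = np.array([[1500., 2200.], [4200., 6100.], [7600., 7900.]])
corrected = vertices + field(vertices)

# Crude uncertainty proxy (not the paper's estimator): leave-one-out residuals
# of the fitted field at the control points.
residuals = []
for i in range(len(ctrl_xy)):
    keep = np.arange(len(ctrl_xy)) != i
    loo = RBFInterpolator(ctrl_xy[keep], ctrl_disp[keep],
                          kernel='thin_plate_spline')
    residuals.append(np.linalg.norm(loo(ctrl_xy[i:i + 1]) - ctrl_disp[i]))
rmse_loo = float(np.sqrt(np.mean(np.square(residuals))))
```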
5.
Statistical maps are now produced in greater numbers, and by persons in a wider range of disciplines, than ever before. It is unfortunate that this increased production has not been accompanied by an improvement in map quality. All too frequently, contemporary mapmakers show a lack of understanding of the primary function of statistical maps, of the symbolic language of mapping, and of the role that data manipulation plays in map communication. Coupled with these deficiencies is a general lack of appreciation of the basic elements of graphic design. Furthermore, the expanded use of the computer in mapmaking seems to be related to, and may even foster, low quality, since programmers are generally untutored in cartographic design and communication.
6.
For more than a decade, efforts to develop and specify the U.S. Spatial Data Transfer Standard (SDTS) have on many occasions encountered limitations in both theory and "gaps in our knowledge" that have hindered its development. This work examines broad categories of these limitations from the perspective of research needs, in order to encourage further research on these topics. Areas in need of further study include fundamental concepts, the specification and use of spatial objects, spatial data quality, entity definitions, the data transfer mechanism, and international comparison of transfer mechanisms. In many cases recent research progress has been made in these areas, and this progress is pointed out. A number of high-priority research areas are identified. It is hoped that this work will encourage more research effort to be directed towards these areas, which will benefit not only the development of spatial data transfer standards but also the spatial data sciences in general.
7.
This paper examines the development of analytical cartography and the contributions Waldo Tobler has made to it, starting well before his definition of the subject in 1976. Analytical cartography's roots in World War II and the Cold War are examined, and the influences and precedents for the academic course that Tobler described are discussed. The systems of knowledge production developed for analytical cartography in its social context are summarized and are found to show a powerful dependence on a working relationship among academia, industry, government, and the intelligence mapping community. Current research trends in analytical cartography, including the organization of research, its institutions, and its priorities, are discussed, and it is proposed that declassifying the "missing pool" of analytical cartographic research literature could be of great benefit in the future. The four-way academic/industrial/government/intelligence partnership is seen as an opportune direction forward for analytical cartography. The next generational shift in the center of the discipline may occur in networks that even Waldo Tobler did not anticipate.
8.
A great convergence of cartography, secrecy, and power occurred during the Cold War. In the American case, a complex series of interactions between secret and classified programs and institutions and their publicly accessible counterparts accomplished both traditional and novel objectives of military geographic intelligence. This process also yielded the World Geodetic System, a mass-centered "figure of the earth" at accuracies adequate for warfare with intercontinental ballistic missiles. A structural and institutional separation developed between enterprises charged with overhead data acquisition systems, which were classified at increasingly high levels of secrecy, and those responsible for data reduction, analysis, and mapping systems, which remained largely unclassified and publicly accessible, in part to conceal the classified data acquisition systems. This structural separation destabilized photogrammetric mapping by displacing systems that privileged dimensional stability with systems that privileged novel sensor types more appropriate to Cold War geo-political objectives and constraints. Eventually, photogrammetric mapping systems were re-stabilized by successfully implementing analytical solutions imposed in digital mapping and data management systems. This achievement re-privileged dimensional stability, now redefined for the new medium of geo-referenced digital data. In the early 1970s these developments culminated in advanced research projects on Military Geographic Intelligence Systems (MGIS). Their deployment in the Vietnam War was both their apex and their undoing. In the aftermath, classified mapping and database systems diverged from civilian versions of MGIS, which became known as Geographic Information Systems (GIS).
9.
An important issue in cartography and GIS is determining the appropriate resolution or cell size when converting vector data to raster. The general consensus is to make the cell size as small as possible to resolve geographic features and provide the most accurate estimates of measurements. Finer resolution results in more accurate estimates of polygon area; however, the raster data structure introduces an artifact that causes errors in the estimation of the length of linear features and of the perimeter of polygon features to increase with increasing resolution. Overestimation as high as 41 percent is theoretically possible and was found to be around 26 percent for representative polygon maps. A method is described that uses a correction coefficient to reduce overestimation error to less than 3 percent.
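As a minimal numerical sketch of the staircase artifact (not the paper's correction method), the code below counts exposed cell edges around a rasterized circle; the raster perimeter overshoots the true circumference by roughly 4/π − 1 ≈ 27 percent, close to the ~26 percent figure quoted above, and dividing by a constant coefficient such as 4/π, assumed here purely for illustration, removes most of the bias.

```python
import numpy as np

def raster_perimeter(mask, cell=1.0):
    """Perimeter of a boolean raster, counted as exposed cell edges."""
    padded = np.pad(mask, 1, constant_values=False)
    edges = np.sum(padded[1:, :] != padded[:-1, :])    # horizontal boundaries
    edges += np.sum(padded[:, 1:] != padded[:, :-1])   # vertical boundaries
    return edges * cell

# Rasterize a circle of radius r (map units) at progressively finer cell sizes.
r = 100.0
true_perimeter = 2 * np.pi * r
for cell in (10.0, 5.0, 1.0):
    n = int(2 * r / cell) + 4                          # grid size with margin
    yy, xx = np.mgrid[0:n, 0:n]
    cx = cy = n / 2
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= (r / cell) ** 2
    est = raster_perimeter(mask, cell)
    corrected = est / (4 / np.pi)                      # constant-coefficient fix
    print(f"cell={cell:5.1f}  raster={est:7.1f}  "
          f"over={100 * (est / true_perimeter - 1):5.1f}%  "
          f"corrected over={100 * (corrected / true_perimeter - 1):5.1f}%")
```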