
Friday 24 August 2018

Case Study Of Buckingham Canal Bridge in Ongole


Abstract:
The Buckingham Canal is a 796 kilometers (494.6 mi) long fresh water navigation canal, running parallel to the Coromandel Coast of South India from Kakinada in the East Godavari district of Andhra Pradesh to Villupuram District in Tamil Nadu. The canal connects most of the natural backwaters along the coast to Chennai (Madras) port. It was constructed during British Rule, and was an important waterway during the late nineteenth and early twentieth centuries.
Owing to several factors and weathering conditions, the Buckingham Canal has lost much of its depth and width; restoration works are now in progress on the canal.




On the route from Ongole to Kothapatnam there is a bridge crossing the Buckingham Canal.
The original bridge was built of stone masonry in the old days, but because of its insufficient width and depth it was replaced by the new bridge shown in the figure below. The canal was first known to the British as the North River and was believed to be partly responsible for reducing cyclone damage to much of the Chennai - southern Andhra coastline.
The canal was used to convey goods up and down the coast from Vijayawada to Madras (now Chennai). The cyclones of 1965/1966 and 1976 damaged the canal, and it is little used and no longer well maintained. Within the city of Chennai the canal is badly polluted from sewage and industrial effluents, and the silting up of the canal has left the water stagnant, creating an attractive habitat for malaria-spreading mosquitoes. The North Chennai Thermal Power Station (NCTP) discharges hot water and fly ash into the canal. In agricultural areas south of Chennai, the former tow path along the scenic areas is used for light motorcycle and bicycle traffic. On 1 January 2001 the Government of India launched a project to prevent sewage discharge into the canal and Chennai's other waterways, and to dredge the canal to remove accumulated sediment and improve water flow.

New construction Bridge:
Ø A well foundation was adopted for this bridge because the safe bearing capacity of the soil is very low

Ø Wash boring was adopted for the foundation investigation because cohesive soils are present in the area

Ø The span of the bridge is 15 m

Ø Open caissons are provided in the well foundation

Ø The depth of the foundation is typically 5 m

Ø The diameter of the well foundation is 3 feet



Image Analysis



Learning Objectives: -

By the end of this topic, you will be able to:

Ø  Recognize the different elements of visual interpretation.

Ø  Describe how digital image processing is done.

Ø  Explain what image preprocessing and enhancement are.

Ø  Understand image classification.

Ø  Differentiate between supervised classification and unsupervised classification.

Introduction:

à Image analysis is the extraction of meaningful information from images; mainly from digital images by means of digital image processing techniques.

à Many image processing and analysis techniques have been developed to aid the interpretation of remote sensing images and to extract as much information as possible from the images.

à The choice of specific techniques or algorithms to use depends on the goals of each individual project.

Elements of Visual Interpretation:-

There are 8 elements of visual interpretation. They are:

1. Tone
2. Shape
3. Size
4. Pattern
5. Texture
6. Shadow
7. Site Factor/Topological Location
8. Association

Size:

à Size of objects in an image is a function of scale.

à It is important to assess the size of a target relative to other objects in a scene, as well as the absolute size, to aid in the interpretation of that target.

à A quick approximation of target size can direct interpretation to an appropriate result more quickly.

à For example, if an interpreter had to distinguish zones of land use, and had identified an area with a number of buildings in it, large buildings such as factories or warehouses would suggest commercial property, whereas small buildings would indicate residential use.

Shape:

1. Shape refers to the general form, structure, or outline of individual objects.
2. Shape can be a very distinctive clue for interpretation.
3. Straight-edged shapes typically represent urban or agricultural (field) targets, while natural features, such as forest edges, are generally more irregular in shape, except where man has created a road or clear cuts.

4. Farm or crop land irrigated by rotating sprinkler systems would appear as circular shapes.

Shadow:-

1. Shadow is also helpful in interpretation, as it may provide an idea of the profile and relative height of a target or targets, which may make identification easier.

2. However, shadows can also reduce or eliminate interpretation in their area of influence, since targets within shadows are much less (or not at all) discernible from their surroundings.

3. Shadow is also useful for enhancing or identifying topography and landforms, particularly in radar imagery.

Tone:-

1. Tone refers to the relative brightness or colour of objects in an image.

2. Generally, tone is the fundamental element for distinguishing between different targets or features.

3.Variations in tone also allow the elements of shape, texture, and pattern of objects to be distinguished.

Texture:

1.Texture refers to the arrangement and frequency of tonal variation in particular areas of an image.

2.Rough textures would consist of a mottled tone where the grey levels change abruptly in a small area, whereas smooth textures would have very little tonal variation.

3.Smooth textures are most often the result of uniform, even surfaces, such as fields, asphalt, or grasslands.

4.A target with a rough surface and irregular structure, such as a forest canopy,
results in a rough textured appearance.

5.Texture is one of the most important elements for distinguishing features in radar imagery.
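One simple way to quantify texture numerically is to measure the variance of grey levels within a moving window: high local variance corresponds to a rough texture, low variance to a smooth one. A minimal sketch in Python (NumPy), for illustration only:

```python
import numpy as np

def local_variance(image, window=3):
    """Simple texture measure: variance of grey levels in a sliding window.
    High values indicate rough texture, low values smooth texture."""
    h, w = image.shape
    half = window // 2
    out = np.zeros((h, w))
    for r in range(half, h - half):
        for c in range(half, w - half):
            out[r, c] = image[r - half:r + half + 1, c - half:c + half + 1].var()
    return out

smooth = np.full((5, 5), 100.0)   # uniform surface, e.g. a calm water body
rough = np.random.default_rng(0).integers(0, 255, (5, 5)).astype(float)
print(local_variance(smooth)[2, 2], local_variance(rough)[2, 2])  # low vs high
```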

Pattern: -

1. Pattern refers to the spatial arrangement of visibly discernible objects.

2. Typically, an orderly repetition of similar tones and textures will produce a distinctive and ultimately recognizable pattern.

3. Orchards with evenly spaced trees and urban streets with regularly spaced houses are good examples of pattern.

Site Factor/Topological Location:-

1. The relative elevation or specific location of objects can be helpful in identifying certain features.

2. For example, the sudden appearance or disappearance of vegetation is a good clue to the underlying soil type or drainage conditions.

Association:

1. Association takes into account the relationship between other recognizable objects or features in proximity to the target of interest.

2. The identification of features that one would expect to associate with other features may provide information to facilitate identification.

3. For example, commercial properties may be associated with proximity to major transportation routes, whereas residential areas would be associated with schools, playgrounds, and sports fields.


Digital Image Processing:-

Digital Image Processing (DIP) is a technique that involves manipulating a digital image to extract information. When satellite images are manipulated in this manner, the technique is also referred to as satellite image processing.

It involves a combination of software-based image processing tools. The whole process of Digital Image Processing can be classified into three parts:

1) Digital Image Pre-Processing
2) Digital Image Enhancement
3) Digital Image classification

Geometric Correction Methods:-

1. The information extracted from remotely sensed images is integrated with map data in a geographical information system.

2. The transformation of a remotely sensed image into a map with defined scale and projection properties is called geometric correction.

3. To correct sensor data, both internal and external errors must be determined.

4. Geometric rectification of the imagery resamples or changes the pixel grid to fit that of a map projection or another reference image.

5. This becomes especially important when scene-to-scene comparisons of individual pixels are sought, as in applications such as change detection.
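Geometric rectification by resampling can be sketched as follows. The `transform` function here is a hypothetical stand-in for the inverse mapping (usually a polynomial fitted to ground control points) from output grid coordinates to input image coordinates, and nearest-neighbour resampling is used for simplicity:

```python
import numpy as np

def nearest_neighbour_resample(image, transform):
    """Resample an image onto a new grid: for each output pixel, apply the
    inverse mapping `transform` (output row/col -> input row/col) and take
    the nearest input pixel. `transform` stands in for the polynomial
    mapping fitted from ground control points."""
    h, w = image.shape
    out = np.zeros_like(image)
    for r in range(h):
        for c in range(w):
            sr, sc = transform(r, c)
            sr, sc = int(round(sr)), int(round(sc))
            if 0 <= sr < h and 0 <= sc < w:   # pixels mapped outside stay 0
                out[r, c] = image[sr, sc]
    return out

img = np.arange(9.0).reshape(3, 3)
shift = lambda r, c: (r, c - 1)   # toy transform: shift one column right
print(nearest_neighbour_resample(img, shift))
```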

Radiometric Correction Methods:-

1. The primary function of remote sensing data quality evaluation is to monitor the performance of the sensors.

2. The performance of the sensors is continuously monitored by applying radiometric correction models to digital image data sets.

3. Radiometric correction methods include:

1. Computation of Radiance (L)
2. Computation of Reflectance
3. Cosmetic Operations
4. Random Noise Removal

1.Computation of Radiance:-

1. Radiance is a measure of the radiant energy given out by an object and picked up by a remote sensor.
2. Spectral radiance (L) is defined as the energy within a wavelength band radiated by a unit area per unit solid angle of measurement.

Radiance (L) = Lmin + (Dn/Dmax)(Lmax - Lmin)

Dn = digital value of a pixel from the computer-compatible tape (CCT)
Dmax = maximum digital number recorded on the CCT
Lmax = maximum radiance, measured at detector saturation (in mW)
Lmin = minimum radiance recorded by the detector (in mW)
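The linear calibration above can be sketched in code. The sensor constants (Dmax, Lmin, Lmax) below are illustrative values, not those of any particular sensor:

```python
def dn_to_radiance(dn, dn_max, l_min, l_max):
    """Convert a raw digital number to spectral radiance:
    L = Lmin + (Dn / Dmax) * (Lmax - Lmin)."""
    return l_min + (dn / dn_max) * (l_max - l_min)

# Example: an 8-bit sensor (Dmax = 255) with hypothetical Lmin = 0, Lmax = 191 mW
print(dn_to_radiance(128, 255, 0.0, 191.0))   # mid-range DN, approx. 95.87
```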

2.Computation of Reflectance:-

Reflectance is an energy ratio which is a function of radiance:

Reflectance = Radiance / (E sin a)

where E is the solar irradiance and a is the solar elevation angle.
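A sketch of the reflectance computation as given by the formula above; E and a are assumed here to be the solar irradiance and elevation angle (note that some formulations also include a factor of pi and use the solar zenith angle):

```python
import math

def radiance_to_reflectance(radiance, irradiance, sun_elevation_deg):
    """Reflectance = L / (E * sin(a)), following the text's formula."""
    return radiance / (irradiance * math.sin(math.radians(sun_elevation_deg)))

# With the sun at zenith (elevation 90 deg), sin(a) = 1
print(radiance_to_reflectance(50.0, 100.0, 90.0))   # -> 0.5
```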

3.Cosmetic Operations:-

1. The cosmetic operations are the correction of digital images containing either partially or entirely missing scan lines, and the destriping of imagery.

2. Due to variation in the sensitivity of the detectors, the recorded irradiance of the object may differ.

3. A line dropout (missing scan line) is usually overcome by replacing the zero values with the mean of the pixel values of the previous and following lines.
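The line-dropout repair described in point 3 can be sketched as:

```python
import numpy as np

def fix_line_dropout(image):
    """Replace all-zero scan lines with the mean of the previous and next lines."""
    img = image.astype(float).copy()
    for r in range(1, img.shape[0] - 1):
        if not img[r].any():          # dropped (all-zero) scan line
            img[r] = (img[r - 1] + img[r + 1]) / 2.0
    return img

img = np.array([[10., 12.],
                [ 0.,  0.],          # missing scan line
                [14., 16.]])
print(fix_line_dropout(img)[1])      # -> [12. 14.]
```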

4.Random Noise Removal:-

1. Image noise is any unwanted disturbance in image data that is due to limitations in the sensing and data recording process.

2. The random noise problems referred to as spiky are characterized by non-systematic variations in gray levels from pixel to pixel.

Atmospheric Correction Methods:-

1. According to Rayleigh scattering, the effect of scattering is inversely proportional to the fourth power of the wavelength.

2. The bias is the amount of offset for each spectral band.

3. The bias can be determined by regressing the visible bands against the infrared bands.

4. To correct for the scattering, first identify some suitable areas and then extract the brightness values of these features from each band.

5. The grey value of the visible band is plotted against the corresponding value at the same pixel location in the infrared band.

6. The plot results in a scatter diagram, to which a regression line is fitted.

7. If the data were free from atmospheric scattering, the best-fit line would pass through the origin.
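The regression-based estimate of the scattering offset described above can be sketched as follows; the band values are synthetic, for illustration only:

```python
import numpy as np

def haze_offset(visible_band, infrared_band):
    """Regress visible against infrared pixel values; the intercept estimates
    the additive scattering (haze) offset in the visible band."""
    slope, intercept = np.polyfit(infrared_band.ravel(), visible_band.ravel(), 1)
    return intercept

ir = np.array([10.0, 20.0, 30.0, 40.0])
vis = ir * 0.8 + 5.0                 # synthetic data with a haze offset of 5
print(round(haze_offset(vis.reshape(2, 2), ir.reshape(2, 2)), 2))   # -> 5.0
```

The correction then subtracts this intercept from every pixel of the visible band.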

Image Enhancement:-

1. Low sensitivity of the detectors, weak signals from objects on the earth's surface, similar reflectance of different objects and the environmental conditions at the time of recording are the major causes of low image contrast.

2. Image enhancement is done to amplify these slight differences for the betterment of the image scene.

3. Image enhancement is defined as the mathematical operations applied to digital remote sensing input data to improve the visual appearance of an image for better interpretability or subsequent digital analysis.

The main enhancement techniques are:

1. Contrast stretching - stretching the contrast to enhance the features of interest:
   - Linear stretch (with or without saturation)
   - Non-linear stretch (for example, sinusoidal, equalization, standard deviation)
2. Spatial filtering
3. Edge enhancement
4. Linear data transformations


Spatial Filtering:-

1. The neighborhood
2. The image profile
3. Numerical filters
4. Low-pass filters
5. High-pass filters


[Figures: the neighborhood, the image profile, numerical filters, low-pass filters, high-pass filters, high- and low-frequency changes, and edge enhancement]

Linear Data Transformation: -

1. The individual bands are often observed to be highly correlated or redundant.

2. Two mathematical transformation techniques are often used to minimize this spectral redundancy:

3. Principal component analysis (PCA)

4. Canonical analysis (CA)

Principal Components Analysis:-

1. Compute a set of new, transformed variables (components), with each component largely independent of the others (uncorrelated).

2. The components represent a set of mutually orthogonal and independent axes in an n-dimensional space.

3. The first new axis contains the highest percentage of the total variance or scatter.
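A minimal PCA sketch for multiband data, following the eigenvector approach described above (illustrative, using synthetic correlated bands):

```python
import numpy as np

def principal_components(bands):
    """bands: (n_pixels, n_bands) array. Returns the PC-transformed data,
    with components ordered by decreasing variance (PC1 first)."""
    centered = bands - bands.mean(axis=0)
    cov = np.cov(centered, rowvar=False)          # band-to-band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]             # largest variance first
    return centered @ eigvecs[:, order]

# Two highly correlated "bands": PC1 captures nearly all the variance
x = np.random.default_rng(0).normal(size=200)
noise = np.random.default_rng(1).normal(scale=0.05, size=200)
bands = np.column_stack([x, x * 0.9 + noise])
pcs = principal_components(bands)
print(pcs.var(axis=0))    # the first component's variance dominates
```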





PC Images: -

1. The digital enhancement techniques available in remote sensing are contrast stretching, ratioing, linear combinations, principal component analysis and spatial filtering.

2. The image enhancement techniques are broadly classified as point operations and local operations.

3. Point operations modify the values of each pixel in the image data set independently.

4. Local operations modify the values of each pixel in the context of the pixel values surrounding it.

5. In a linear contrast stretch, the digital number (DN) value at the lower end of the original histogram is assigned to digital number zero (extreme black) and the value at the higher end is assigned to extreme white.

6. The intermediate values are interpolated between 0 and 255 following the linear relationship Y = a + bX.

7. Exponential contrast enhancement is also considered a non-linear contrast enhancement.

8. The grey values in the input image are transformed to new grey values in the output image.

9. Histogram equalization is widely used for contrast manipulation in digital image processing because it is very simple to implement.

10. Here, the original histogram is readjusted to produce a uniform population density of pixels along the horizontal grey value axis.
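The linear stretch (points 5-6) and histogram equalization (points 9-10) can be sketched as:

```python
import numpy as np

def linear_stretch(image, out_max=255):
    """Map the image's minimum DN to 0 and its maximum DN to out_max,
    interpolating intermediate values linearly: Y = a + b*X."""
    x_min, x_max = image.min(), image.max()
    b = out_max / (x_max - x_min)
    a = -b * x_min
    return a + b * image

def histogram_equalize(image, levels=256):
    """Redistribute grey values so the cumulative histogram is ~uniform."""
    hist, _ = np.histogram(image.ravel(), bins=levels, range=(0, levels))
    cdf = hist.cumsum() / image.size                  # cumulative fraction
    return np.floor((levels - 1) * cdf[image.astype(int)])

img = np.array([[50., 60.], [70., 80.]])
print(linear_stretch(img))    # DN range 50..80 stretched to 0..255
```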

Image Classification: -


                
1. Image classification is the process of creating a meaningful digital thematic map from an image data set (information extraction).

֍  Supervised classification: classes are derived from known cover types.
֍  Unsupervised classification: classes are found by algorithms that search the data for similar pixels.

The process of image classification: -

Supervised Classification: -

1. Classification methods that rely on the use of training patterns are called supervised classification methods.

2.A supervised classification algorithm requires a training sample for each class.

3.The training samples are representative of the known classes of interest to the analyst.

Training stage: The analyst identifies representative training areas and develops numerical descriptions of the spectral signatures of each land cover type of interest in the scene.

The classification stage: Each pixel in the image data set is categorized into the land cover class it most closely resembles. If the pixel is insufficiently similar to any training data set it is usually labelled 'unknown'.

The output stage: Three typical forms of output products are thematic maps, tables and digital data files. The output of image classification becomes input data for GIS for spatial analysis of the terrain.


The supervised classification process consists of:

1. Training class selection (training areas/classes)
2. Generating statistical parameters (spectral signatures) of the training classes
3. Data classification
4. Evaluation and refinement

Supervised Spectral Classification:

Common Classifiers:

1. Parallelepiped
2. Minimum distance to mean
3. Maximum likelihood

Parallelepiped:
Ø A simple method
Ø Makes few assumptions about the character of the classes

Minimum distance to mean:
Ø Find the mean value of the pixels of the training sets in n-dimensional space
Ø All pixels in the image are classified according to the class mean to which they are closest

Maximum likelihood:
Ø All regions of n-dimensional space are classified
Clustering: -
Clustering Process: -

Classification Procedures: -

