
Image Processing Research and Applications

[Image Processing System - JavaTPoint]
 

- Overview

In the past few years, deep learning has had a huge impact on various technical fields. One of the hottest topics in the industry is computer vision, or the ability for computers to understand images and video on their own. Self-driving cars, biometrics, and facial recognition all rely on computer vision to work. At the heart of computer vision is image processing.

Image processing is a technique that enhances raw images received from cameras/sensors placed on satellites, space probes, and aircraft, or photos taken in everyday life for various applications.

There are two methods of image processing: analog image processing and digital image processing. Analog image processing refers to the alteration of images through electrical means; the most common example is the television image. In digital image processing, digital computers are used to process the image: the image is first converted to digital form using a scanner-digitizer and then processed.

 

- Images

Before we get into image processing, we need to understand what exactly an image is made of. 

In fact, every scene around us forms an image. Images are formed from two-dimensional analog or digital signals in which color information is arranged along the x and y spatial axes.

Images are represented by pixel-based dimensions (height and width). For example, if an image has dimensions 500 x 400 (width x height), the total number of pixels in the image is 200,000.

A pixel is a point in the image that has a specific shade, opacity, or color. It is usually represented in one of the following ways (a short sketch follows the list):

  • Grayscale - A pixel is an integer with a value between 0 and 255 (0 is completely black and 255 is completely white).
  • RGB - A pixel consists of 3 integers between 0 and 255 (the integers represent the intensity of red, green, and blue).
  • RGBA - It is an extension of RGB with the addition of an alpha field, representing the opacity of the image.
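
A minimal sketch of these representations, assuming NumPy (the article does not name a library): each image type becomes an array of 8-bit values, and the pixel count of a 500 x 400 image follows directly from its dimensions.

  import numpy as np

  # Grayscale image: one 8-bit integer per pixel, 0 (black) to 255 (white).
  gray = np.zeros((400, 500), dtype=np.uint8)       # height x width

  # RGB image: three 8-bit channels per pixel (red, green, blue).
  rgb = np.zeros((400, 500, 3), dtype=np.uint8)
  rgb[..., 0] = 255                                  # a fully red image

  # RGBA image: RGB plus an alpha (opacity) channel.
  rgba = np.zeros((400, 500, 4), dtype=np.uint8)
  rgba[..., 3] = 255                                 # fully opaque

  # A 500 x 400 (width x height) image has 200,000 pixels.
  print(gray.shape[0] * gray.shape[1])               # -> 200000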

Image processing applies a fixed sequence of operations to each pixel of the image. The image processor performs the first operation on the image pixel by pixel; when it is done, it starts the second, and so on. The output value of each operation can be computed at any pixel of the image.
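
A hedged sketch of such a pixel-by-pixel sequence, again assuming NumPy; the two operations (inversion, then thresholding) are illustrative choices, not part of the original text.

  import numpy as np

  def invert(image):
      """First pass: invert every pixel (255 becomes 0, 0 becomes 255)."""
      return 255 - image

  def threshold(image, level=128):
      """Second pass: map each pixel to black or white around a threshold."""
      return np.where(image >= level, 255, 0).astype(np.uint8)

  # The processor finishes the first operation on all pixels,
  # then starts the second on the intermediate result.
  image = np.random.randint(0, 256, size=(400, 500), dtype=np.uint8)
  inverted = invert(image)
  binary = threshold(inverted)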

 

- Image Processing

Image processing is the process of converting an image into digital form and performing certain operations to obtain some useful information from it. Image processing systems generally treat all images as two-dimensional signals when applying certain predetermined signal processing methods.

There are five main types of image processing:

  • Visualization – Observe objects that are not directly visible. 
  • Image sharpening and restoration – Create a better image (see the sketch after this list). 
  • Image retrieval – Search for an image of interest. 
  • Measurement of pattern – Measure the various objects in an image. 
  • Image recognition – Distinguish the objects in an image.
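
As an illustration of the sharpening case, a common approach (an assumption here, not stated in the article) is to convolve the image with a small sharpening kernel; the sketch below uses NumPy and SciPy's ndimage module.

  import numpy as np
  from scipy import ndimage

  # A common 3x3 sharpening kernel: boosts the centre pixel and
  # subtracts a fraction of its neighbours.
  kernel = np.array([[ 0, -1,  0],
                     [-1,  5, -1],
                     [ 0, -1,  0]])

  image = np.random.randint(0, 256, size=(400, 500)).astype(float)

  # Convolve, then clip back to the valid 8-bit range.
  sharpened = ndimage.convolve(image, kernel, mode="reflect")
  sharpened = np.clip(sharpened, 0, 255).astype(np.uint8)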

 

- Analog and Digital Signals

Analog and digital signals are types of signals that carry information. The key difference between the two is that an analog signal is continuous whereas a digital signal is discrete. The difference between analog and digital signals can be observed in the shapes of their waveforms.

Analog signals are used in many systems to carry information. These signals are continuous in both value and time. Their use has decreased with the advent of digital signals. In short, all natural or naturally occurring signals are analog signals.

Unlike analog signals, digital signals are not continuous but discrete in value and time. They are represented by binary numbers and consist of discrete voltage levels.

Digital signal processing (DSP) is about processing analog or real-world signals, such as speech, for human interaction. DSP systems use converters to change analog signals into digital form and vice versa.
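
A minimal sketch of what such a converter does conceptually, assuming NumPy: a continuous signal (a 5 Hz sine wave, chosen only for illustration) is sampled at discrete instants and quantized to 8-bit values.

  import numpy as np

  fs = 100                      # sampling rate in Hz (assumed for illustration)
  t = np.arange(0, 1, 1 / fs)   # discrete sample instants over one second

  # "Analog" source: a continuous 5 Hz sine wave evaluated at the sample instants.
  analog = np.sin(2 * np.pi * 5 * t)

  # Quantization: map the [-1, 1] range onto 256 discrete 8-bit levels.
  digital = np.round((analog + 1) / 2 * 255).astype(np.uint8)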

A digital signal processor is a specialized processor used in a wide range of electronic devices and applications, such as CD players, mobile phones, battlefield systems, satellites, medical imaging, and voice-detection machines.

 

- Digital Images and Signals

Images are two-dimensional arrays in which color information is arranged along the x and y spatial axes. So, to understand how images are formed, we should first understand how signals are formed.

A signal is a mathematical and statistical way of connecting us to the physical world. It can be measured over the dimensions of space and time, and it is used to convey information from one source to another.

Signals can be one-dimensional, two-dimensional, or of higher dimension. Common examples are sound, images, and sensor output signals.

Here, a one-dimensional signal is measured over time (for example, sound), while a two-dimensional signal is measured over two spatial dimensions, as in a digital image.

A signal is anything that communicates information in the physical world, such as a sound or an image. Whatever we say is first converted into a signal or wave and then delivered to the listener. Likewise, when an image is captured by a digital camera, a signal is transferred from one system to another.
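
A sketch of this idea for images, assuming NumPy: a continuous two-dimensional intensity function f(x, y) is sampled on a pixel grid and quantized, which is how a digital image arises from a 2D signal. The particular function is an arbitrary choice for illustration.

  import numpy as np

  # Sample a continuous 2-D intensity function f(x, y) on a 400 x 500 grid.
  y, x = np.mgrid[0:400, 0:500]

  # f(x, y): a smooth sinusoidal pattern, chosen only for illustration.
  f = 0.5 + 0.5 * np.sin(2 * np.pi * x / 100) * np.cos(2 * np.pi * y / 100)

  # Quantize the sampled values to 8-bit pixels: a digital image.
  image = np.round(f * 255).astype(np.uint8)
  print(image.shape)   # (400, 500): intensity arranged along the y and x axes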

 

- Computer Vision and Image Processing 

The human eye has 6 to 7 million cone cells, which contain one of three color-sensitive proteins called opsins. When photons hit these opsins, they change shape, triggering a cascade of electrical signals that transmit information to the brain for interpretation.

The whole process is a very complex phenomenon, and it has been a challenge for machines to reproduce it at a human level. The motivation behind modern machine vision systems is, at its core, to simulate human vision: to recognize patterns and faces, and to reconstruct the 3D structure of the world from 2D images.

On a conceptual level, there is a lot of overlap between image processing and computer vision, and the often-misunderstood terms are used interchangeably. Computer vision builds on image processing by applying machine learning techniques: it uses machine learning to identify patterns when interpreting images, much like the visual reasoning of human vision, which can distinguish objects, classify them, and sort them by size. Like image processing, computer vision takes an image as input and gives an output in the form of information such as size, color intensity, etc.

Image processing is a subset of computer vision. Computer vision systems use image processing algorithms in an attempt to perform visual tasks at a human level. For example, if the goal is to enhance an image for later use, this can be called image processing; if the goal is to recognize objects or defects, or to enable autonomous driving, it can be called computer vision.
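
A sketch of that division of labor, assuming NumPy and SciPy: an image-processing stage (smoothing and thresholding) prepares the image, and a vision-style stage (connected-component labelling) then extracts information, here the number of distinct bright regions.

  import numpy as np
  from scipy import ndimage

  image = np.random.randint(0, 256, size=(400, 500)).astype(float)

  # Image processing: enhance/clean the image for later use.
  smoothed = ndimage.gaussian_filter(image, sigma=3)
  binary = smoothed > smoothed.mean()

  # Computer vision: extract information from the image --
  # here, how many connected bright regions ("objects") it contains.
  labels, num_objects = ndimage.label(binary)
  print(num_objects)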

 

- Analog and Digital Image Processing

There are two types of methods used for image processing namely, analog and digital image processing. 

  • Analog image processing is applied to analog signals and deals only with two-dimensional signals. Images are manipulated by electrical signals. In analog image processing, the analog signals can be either periodic or non-periodic. Examples of analog images are television images, photographs, paintings, and medical images.
  • Digital image processing is applied to digital images (matrices of small picture elements, or pixels). Many software tools and algorithms are available to manipulate the images (see the sketch after this list). Digital image processing is one of the fastest growing fields, affecting everyone's life. Examples of digital image processing are color processing, image recognition, video processing, etc.
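
A sketch of such software-based manipulation using the Pillow library (an assumption; any image library would serve), with a hypothetical file name: the pixel matrix is loaded, converted to grayscale, and resized.

  from PIL import Image

  # Load a digital image (a matrix of pixels); "photo.jpg" is a placeholder path.
  img = Image.open("photo.jpg")

  # Convert the RGB pixel matrix to grayscale.
  gray = img.convert("L")

  # Resize (resample) the pixel matrix to new dimensions (width, height).
  small = gray.resize((250, 200))
  small.save("photo_small.png")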

 

[Murren, Switzerland - Christophe Cosset]

- Analog Image Processing vs. Digital Image Processing

There are the following differences between Analog Image Processing and Digital Image Processing:
  • Analog image processing is applied to analog signals and deals only with two-dimensional signals, while digital image processing is applied to the analysis and manipulation of digital image signals.
  • An analog signal is time-varying, so the image formed by analog processing varies with it; digital processing improves the digital quality of the image and yields a more uniform intensity distribution.
  • Analog image processing is a slower and more expensive process; digital image processing offers lower cost and faster image storage and retrieval.
  • Analog signals capture images of the real world, but at poorer quality; digital processing uses effective image compression techniques to reduce the amount of data required while producing high-quality images.
  • An analog image is usually continuous and not broken into discrete parts; digital processing uses image segmentation techniques that detect discontinuities such as broken connecting paths.

 

- AI Image Processing

AI image processing is the process or application of artificial intelligence algorithms to understand, interpret, and manipulate visual data or images. This also involves analyzing and enhancing image quality to extract information. 

Essentially, the core functions of AI image processing such as image recognition, segmentation, and enhancement allow various systems to identify, understand, and classify images from a wide database.
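
A sketch of AI-based image recognition using a pretrained classifier from torchvision (an assumed framework and model, requiring torchvision >= 0.13; the article does not prescribe one). The input path is hypothetical.

  import torch
  from PIL import Image
  from torchvision import models

  # Load a pretrained image-recognition model (ResNet-18 trained on ImageNet).
  weights = models.ResNet18_Weights.DEFAULT
  model = models.resnet18(weights=weights)
  model.eval()

  # Preprocess the input image the way the model expects; the path is a placeholder.
  preprocess = weights.transforms()
  img = Image.open("photo.jpg").convert("RGB")
  batch = preprocess(img).unsqueeze(0)     # add a batch dimension

  # Classify: report the highest-scoring ImageNet category.
  with torch.no_grad():
      scores = model(batch)
  label = weights.meta["categories"][scores.argmax().item()]
  print(label)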

 

- Research Topics in Digital Image Processing (DIP)

  • Analog Image vs Digital Image
  • AI Image Processing
  • Digital Image and Signal
  • Signal and System
  • Analog signals vs Digital signals
  • Continuous Systems vs Discrete Systems
  • History of Photography
  • Portable Cameras vs Digital Cameras
  • DIP Applications
  • Concept of Dimensions
  • Image Formation on Camera
  • Camera Mechanism
  • Concept of Pixel
  • Perspective Transformation
  • Concept of Bits Per Pixel
  • Types of Images
  • Color Codes Conversion
  • Grayscale to RGB Conversion
  • Concept of Sampling
  • Pixels, Dots and Lines per Inch
  • DIP Resolution
  • Quantization Concept
  • Dithering Concept
  • DIP Histograms
  • Brightness & Contrast
  • Image Transformation
  • Gray Level Transformation
  • Concept of Convolution
  • Concept of Mask
  • Robinson Compass Mask
  • Kirsch Compass Mask
  • Concept of Blurring
  • Concept of Edge Detection
  • Frequency domain Introduction
  • High Pass vs Low Pass Filters
  • Color Spaces Introduction
  • JPEG Compression
  • Computer Vision vs Computer Graphics
 

 

[More to come ...] 

 