AN ENHANCED LIGHTWEIGHT TRANSFORMER METHOD USING STRONG ENCODER TECHNIQUES IN UNDERWATER OBJECT DETECTION
Abstract
The two-stage lightweight transformer method is adapted to underwater object detection through a set of targeted modifications that account for the characteristics of the underwater environment. Significant gains are achieved by applying multi-scale training, enhancing the backbone network of Faster RCNN, and optimising the model's response to positive and negative examples, with an emphasis on the negative samples. Comparative experiments show that the proposed network module can serve as the feature extraction structure of the method, confirming the effectiveness of the enhanced lightweight transformer approach. The full system is decomposed into its components before the detection algorithm and the ablation tests are run, which allows a further evaluation of the efficiency of the lightweight transformer object detection algorithm. Compared with the unimproved lightweight transformer method, the F1 score approaches 99%, a substantial improvement that confirms the effectiveness of the proposed techniques.
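Because the abstract describes the architecture only at a high level, the following is a minimal sketch, not the authors' released code, of the idea it outlines: a lightweight transformer encoder used as the Faster RCNN feature-extraction backbone, with multi-scale training enabled through the image-resizing transform. The choice of torchvision's Swin-T as the stand-in transformer encoder, the anchor sizes, the training scales, and the class count are illustrative assumptions rather than values taken from the paper.

# Sketch only: a transformer encoder plugged into Faster R-CNN as the backbone.
import torch
from torch import nn
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator

class SwinBackbone(nn.Module):
    """Wrap torchvision's Swin-T so it yields a channels-first feature map."""
    def __init__(self):
        super().__init__()
        swin = torchvision.models.swin_t(weights=None)  # pretrained weights could be loaded instead
        self.body = swin.features          # the transformer encoder stages
        self.out_channels = 768            # channel width of the last Swin-T stage

    def forward(self, x):
        feats = self.body(x)               # torchvision's Swin emits (N, H, W, C)
        return feats.permute(0, 3, 1, 2)   # Faster R-CNN expects (N, C, H, W)

# One feature map, so one tuple of anchor sizes (values are assumptions).
anchor_generator = AnchorGenerator(
    sizes=((32, 64, 128, 256, 512),),
    aspect_ratios=((0.5, 1.0, 2.0),),
)

# Passing a tuple for min_size turns on multi-scale training: each training
# image is resized to a randomly chosen scale from this list.
model = FasterRCNN(
    SwinBackbone(),
    num_classes=5,                         # e.g. 4 underwater classes + background (assumed)
    rpn_anchor_generator=anchor_generator,
    min_size=(480, 600, 720, 800),
    max_size=1333,
)

model.eval()
with torch.no_grad():
    detections = model([torch.rand(3, 600, 800)])
print(detections[0].keys())                # boxes, labels, scores

In this sketch the F1 figure quoted in the abstract would be computed from the predicted boxes by matching them to ground truth at a chosen IoU threshold and combining the resulting precision and recall as F1 = 2PR / (P + R).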

Authors
N. Sivakumar1, A. Sumalatha2, K. Prabhu3
Varuvan Vadivelan Institute of Technology, India1, Kristu Jayanti College, India2, GRT Institute of Engineering and Technology, India3

Keywords
Lightweight Transformer, Strong Encoder, Underwater Object Detection
Published By :
ICTACT
Published In :
ICTACT Journal on Image and Video Processing
(Volume: 13, Issue: 3, Pages: 2928 - 2933)
Date of Publication :
February 2023

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.