
EESRGAN: Efficient & Effective Super-Resolution Generative Adversarial Network

Pages 200-211 | Published online: 19 Jul 2023
 

Abstract

In Taiwan, traditional production equipment for mainframe panels is imported from overseas, and its parameters are adjusted through the operation panel for automated manufacturing. However, because these parameters differ slightly from those used in Taiwan, warning parameters must be recorded manually to ensure quality control. At present, the only way to capture the operation records of these machines is to transcribe the system panel information by hand, a time-consuming and laborious process that is costly for production-line personnel. In this work, we use image recognition to capture and analyze the data externally, without damaging the machine. Our approach improves the ESRGAN network to restore the image details and textures of the mainframe panels, and uses Google OCR to convert the panel images into parameter values. The extracted results are combined with data analysis to provide more accurate standard mainframe-panel information. Even under different sources of interference, the data can still be extracted, analyzed, and output to text files, effectively assisting the production line in recording machine-panel parameters and reducing personnel workload.
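The abstract describes a three-stage pipeline: super-resolve the captured panel image with the improved ESRGAN, run OCR (Google OCR in the paper) on the restored image, then parse the recognized text into parameter values for logging. A minimal sketch of that flow is below; the function names, the placeholder super-resolution step, and the `NAME: value` parsing format are illustrative assumptions, not the authors' actual code.

```python
import re

def super_resolve(image_bytes: bytes) -> bytes:
    """Placeholder for the improved-ESRGAN generator described in the paper.

    A real system would load the trained model and run inference here;
    this stand-in simply passes the image through unchanged.
    """
    return image_bytes

def parse_panel_text(ocr_text: str) -> dict:
    """Extract numeric 'NAME: value' or 'NAME = value' parameters from OCR output.

    Non-numeric lines (e.g. status flags) are ignored, since only the
    warning/control parameters need to be logged.
    """
    params = {}
    for line in ocr_text.splitlines():
        m = re.match(r"\s*([A-Za-z_ ]+?)\s*[:=]\s*([-+]?\d+(?:\.\d+)?)", line)
        if m:
            params[m.group(1).strip()] = float(m.group(2))
    return params

if __name__ == "__main__":
    # OCR output for a hypothetical panel frame (the OCR call itself,
    # e.g. to the Google Cloud Vision API, is omitted here).
    sample = "TEMP: 182.5\nPRESSURE = 3.2\nSTATUS: RUN"
    print(parse_panel_text(sample))  # {'TEMP': 182.5, 'PRESSURE': 3.2}
```

In practice the parsed dictionary would then be appended to a text file per machine, matching the paper's goal of replacing manual transcription of panel readings.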

DISCLOSURE STATEMENT

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

An-Chao Tsai

An-Chao Tsai (Senior Member, IEEE) received the PhD degree in electrical engineering from National Cheng Kung University, Taiwan, in 2010. He became a Member (M) of IEEE in 2015 and a Senior Member (SM) in 2020. He is currently an associate professor in the International Master Program of Information Technology and Application, National Pingtung University, Pingtung, Taiwan. Prof. Tsai has also served as track chair and program chair for the IEEE International Conference on Orange Technologies since 2015. His research interests include video processing, artificial intelligence, virtual reality, and AIoT.

Cheng-Han Tsou

Cheng-Han Tsou received the MS degree from the Department of Electrical Engineering, National Cheng Kung University, Taiwan, in 2023. He is currently a hardware engineer at MediaTek Inc. His research interests include artificial intelligence, machine learning, image processing, image recognition, algorithm design, and pattern recognition. Email: [email protected].

Jhing-Fa Wang

Jhing-Fa Wang (Life Fellow, IEEE) is currently a chair professor and distinguished professor with the Department of Electrical Engineering, National Cheng Kung University, Taiwan. He has published about 138 journal papers in IEEE, SIAM, IEICE, and IEE venues and about 235 conference papers. He developed a Mandarin speech recognition system called Venus-Dictate, known as a pioneering system in Taiwan. He served as editor-in-chief of the International Journal of Chinese Engineering from 1995 to 2000, and was elected a Fellow of the IEEE in 1999 for his contributions to "Hardware and Software Co-Design on Speech Signal Processing". His research interests include speech signal processing, image processing, biomedical signal processing, and VLSI system design. Email: [email protected].

