Research Article

Experimental Analysis of Style Transfer and Target Detection in Interactive Art on Smartphones

