Search Results

Showing 1-20 of 10,000 results
  1. Attention and self-attention in random forests

    New models of random forests jointly using the attention and self-attention mechanisms are proposed for solving the regression problem. The models...

    Lev V. Utkin, Andrei V. Konstantinov, Stanislav R. Kirpichenko in Progress in Artificial Intelligence
    Article 11 May 2023
  2. Conv-Attention: A Low Computation Attention Calculation Method for Swin Transformer

    Transformer networks have excellent performance in various vision tasks, especially object detection. However, in practical applications,...

    Zhehang Lou, Suyun Luo, ... Dan Wei in Neural Processing Letters
    Article Open access 24 February 2024
  3. Layer-Wise External Attention by Well-Localized Attention Map for Efficient Deep Anomaly Detection

    The external attention mechanism offers a promising approach to enhance image anomaly detection (Hayakawa et al., in: IMPROVE, pp. 100–110, 2023)....

    Keiichi Nakanishi, Ryo Shiroma, ... Terumasa Tokunaga in SN Computer Science
    Article Open access 28 May 2024
  4. Multi-granularity attention in attention for person re-identification in aerial images

    In marrying with Unmanned Aerial Vehicles (UAVs), the person re-identification (re-ID) techniques are further strengthened in terms of mobility....

    Simin Xu, Lingkun Luo, ... Shiqiang Hu in The Visual Computer
    Article 19 September 2023
  5. Fault-attri-attention: a method for fault identification based on seismic attributes attention

    The imaging principle of seismic images is different from natural images, which results in very limited resolution, complex reflection features and...

    Xiao Li, Kewen Li in Neural Computing and Applications
    Article 06 December 2023
  6. Hydra Attention: Efficient Attention with Many Heads

    While transformers have begun to dominate many tasks in vision, applying them to large images is still computationally difficult. A large reason for...
    Daniel Bolya, Cheng-Yang Fu, ... Judy Hoffman in Computer Vision – ECCV 2022 Workshops
    Conference paper 2023
  7. Visual attention network

    While originally designed for natural language processing tasks, the self-attention mechanism has recently taken various computer vision areas by...

    Meng-Hao Guo, Cheng-Ze Lu, ... Shi-Min Hu in Computational Visual Media
    Article Open access 28 July 2023
  8. Attention-guided Erasing

    The assessment of breast density is crucial in the context of breast cancer screening, especially in populations with a higher percentage of dense...
    Adarsh Bhandary Panambur, Hui Yu, ... Andreas Maier in Bildverarbeitung für die Medizin 2024
    Conference paper 2024
  9. Attention and Distraction

    Attention is a precious mental resource. Every day we are bombarded with information, decisions, distractions, and things we need to remember. What...
    Scott Riley in Mindful Design
    Chapter 2024
  10. Vision Transformers with Hierarchical Attention

    This paper tackles the high computational/space complexity associated with multi-head self-attention (MHSA) in vanilla vision transformers. To this...

    Yun Liu, Yu-Huan Wu, ... Luc Van Gool in Machine Intelligence Research
    Article Open access 19 April 2024
  11. Attention-Based DCNs

    This chapter first provides a brief overview of attention-based Deep Cognitive Networks (DCNs). Then, representative models from two aspects in terms...
    Yan Huang, Liang Wang in Deep Cognitive Networks
    Chapter 2023
  12. Recent advancements in driver’s attention prediction

    Ensuring the precise anticipation of a driver’s attention is crucial for upholding safety in diverse human-centric transportation scenarios. This...

    Morteza Moradi, Simone Palazzo, ... Concetto Spampinato in Multimedia Tools and Applications
    Article 18 May 2024
  13. Point-attention Net: a graph attention convolution network for point cloud segmentation

    Point cloud classification and segmentation is a crucial yet challenging step towards 3D scene understanding. In response to the fact that 3D point...

    Suting Chen, Zelin Miao, ... Yanyan Zhang in Applied Intelligence
    Article 03 September 2022
  14. Self-Enhanced Attention for Image Captioning

    Image captioning, which involves automatically generating textual descriptions based on the content of images, has garnered increasing attention from...

    Qingyu Sun, Juan Zhang, ... Yongbin Gao in Neural Processing Letters
    Article Open access 01 April 2024
  15. Multi-Keys Attention Network for Image Captioning

    The image captioning task aims to generate descriptions from the main content of images. Recently, the Transformer with a self-attention mechanism...

    Ziqian Yang, Hui Li, ... Jimin Xiao in Cognitive Computation
    Article 24 January 2024
  16. Efficient Attention for Domain Generalization

    Deep neural networks suffer severe performance degradation when encountering domain shift. Previous methods mainly focus on feature manipulation in...
    Zhongqiang Zhang, Ge Liu, ... Xiangzhong Fang in Neural Information Processing
    Conference paper 2024
  17. Masked cross-attention and multi-head channel attention guiding single-stage generative adversarial networks for text-to-image generation

    Although the text-to-image model aims to generate realistic images that correspond to the text description, generating high-quality, and accurate...

    Shouming Hou, Ziying Li, ... Hui Li in The Visual Computer
    Article 09 February 2024
  18. Graph attention information fusion for Siamese adaptive attention tracking

    A single target tracker based on a Siamese network regards tracking as a process of similarity matching. The convolution features of the template...

    Lixin Wei, Zeyu Xi, ... Hao Sun in Applied Intelligence
    Article 05 May 2022
  19. Quantum self-attention neural networks for text classification

    An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including...

    Guangxi Li, Xuanqiang Zhao, Xin Wang in Science China Information Sciences
    Article 27 March 2024
  20. An attention matrix for every decision: faithfulness-based arbitration among multiple attention-based interpretations of transformers in text classification

    Transformers are widely used in natural language processing, where they consistently achieve state-of-the-art performance. This is mainly due to...

    Nikolaos Mylonas, Ioannis Mollas, Grigorios Tsoumakas in Data Mining and Knowledge Discovery
    Article 28 August 2023