Search Results
- Attention and self-attention in random forests
  New models of random forests jointly using the attention and self-attention mechanisms are proposed for solving the regression problem. The models...
- Conv-Attention: A Low Computation Attention Calculation Method for Swin Transformer
  Transformer networks have excellent performance in various vision tasks, especially object detection. However, in practical applications,...
- Layer-Wise External Attention by Well-Localized Attention Map for Efficient Deep Anomaly Detection
  The external attention mechanism offers a promising approach to enhance image anomaly detection (Hayakawa et al., in: IMPROVE, pp. 100–110, 2023)....
- Multi-granularity attention in attention for person re-identification in aerial images
  Combined with Unmanned Aerial Vehicles (UAVs), person re-identification (re-ID) techniques are further strengthened in terms of mobility....
- Fault-attri-attention: a method for fault identification based on seismic attributes attention
  The imaging principle of seismic images differs from that of natural images, which results in very limited resolution, complex reflection features, and...
- Hydra Attention: Efficient Attention with Many Heads
  While transformers have begun to dominate many tasks in vision, applying them to large images is still computationally difficult. A large reason for...
- Visual attention network
  While originally designed for natural language processing tasks, the self-attention mechanism has recently taken various computer vision areas by...
- Attention-guided Erasing
  The assessment of breast density is crucial in the context of breast cancer screening, especially in populations with a higher percentage of dense...
- Attention and Distraction
  Attention is a precious mental resource. Every day we are bombarded with information, decisions, distractions, and things we need to remember. What...
- Vision Transformers with Hierarchical Attention
  This paper tackles the high computational/space complexity associated with multi-head self-attention (MHSA) in vanilla vision transformers (a generic sketch of the standard self-attention these methods build on follows this list). To this...
- Attention-Based DCNs
  This chapter first provides a brief overview of attention-based Deep Cognitive Networks (DCNs). Then, representative models from two aspects in terms...
- Recent advancements in driver’s attention prediction
  Precisely anticipating a driver’s attention is crucial for safety in diverse human-centric transportation scenarios. This...
- Point-attention Net: a graph attention convolution network for point cloud segmentation
  Point cloud classification and segmentation are crucial yet challenging steps towards 3D scene understanding. In response to the fact that 3D point...
- Self-Enhanced Attention for Image Captioning
  Image captioning, which involves automatically generating textual descriptions based on the content of images, has garnered increasing attention from...
- Multi-Keys Attention Network for Image Captioning
  The image captioning task aims to generate descriptions from the main content of images. Recently, the Transformer with a self-attention mechanism...
- Efficient Attention for Domain Generalization
  Deep neural networks suffer severe performance degradation when encountering domain shift. Previous methods mainly focus on feature manipulation in...
- Masked cross-attention and multi-head channel attention guiding single-stage generative adversarial networks for text-to-image generation
  Although text-to-image models aim to generate realistic images that correspond to the text description, generating high-quality and accurate...
- Graph attention information fusion for Siamese adaptive attention tracking
  A single-target tracker based on a Siamese network regards tracking as a process of similarity matching. The convolution features of the template...
- Quantum self-attention neural networks for text classification
  An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including...
- An attention matrix for every decision: faithfulness-based arbitration among multiple attention-based interpretations of transformers in text classification
  Transformers are widely used in natural language processing, where they consistently achieve state-of-the-art performance. This is mainly due to...