

Traditional Machine Learning Models and Bidirectional Encoder Representations From Transformer (BERT)-Based Automatic Classification of Tweets About Eating Disorders: Algorithm Development and Validation Study


The study aimed to identify machine learning models that can efficiently categorize tweets about eating disorders. More than a million tweets were collected over a three-month period and classified using both traditional…

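The summary above contrasts traditional machine learning models with BERT for tweet classification. As a minimal sketch of the "traditional" side, the stdlib-only snippet below classifies a toy tweet with TF-IDF vectors and nearest-neighbour cosine similarity; the labels, example tweets, and the nearest-neighbour choice are all invented for illustration and are not the study's actual pipeline.

```python
import math
from collections import Counter

# Hypothetical labeled mini-corpus (invented for illustration).
train = [
    ("i skipped every meal again today", "ed_related"),
    ("struggling with my eating disorder recovery", "ed_related"),
    ("great pasta recipe for dinner tonight", "other"),
    ("watching the football game with friends", "other"),
]

def tf_idf(docs):
    """TF-IDF vectors (as sparse dicts) for a list of token lists."""
    n = len(docs)
    df = Counter(t for doc in docs for t in set(doc))
    return [
        {t: c / len(doc) * math.log((1 + n) / (1 + df[t]))
         for t, c in Counter(doc).items()}
        for doc in docs
    ]

def cosine(a, b):
    dot = sum(v * b.get(t, 0.0) for t, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [text.split() for text, _ in train]

def classify(text):
    """Nearest neighbour over TF-IDF vectors: a simple stand-in for
    the traditional models (e.g. SVMs, logistic regression)."""
    # Vectorize the query alongside the corpus so IDF weights match.
    vecs = tf_idf(docs + [text.split()])
    query = vecs[-1]
    sims = [cosine(query, v) for v in vecs[:-1]]
    return train[sims.index(max(sims))][1]

print(classify("another day of not eating anything"))  # → ed_related
```

A BERT-based model would replace the TF-IDF vectors with contextual embeddings and a fine-tuned classification head, typically at much higher computational cost.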

Self-Attention through Kernel-Eigen Pair Sparse Variational Gaussian Processes


The study proposes Kernel-Eigen Pair Sparse Variational Gaussian Processes (KEP-SVGP) for building uncertainty-aware self-attention in transformers. The asymmetry of attention kernels is addressed using Kernel SVD (KSVD), yielding reduced complexity…

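The key observation in the summary above is that the softmax attention matrix is asymmetric, so symmetric-kernel machinery (eigendecomposition of a Mercer kernel) does not apply directly, and SVD-style singular pairs are used instead. The NumPy sketch below illustrates only that observation on a random toy attention matrix; it is not the KEP-SVGP implementation, and the sizes and rank are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, rank = 6, 4, 2  # sequence length, head dim, kept singular pairs

# Toy queries and keys (illustrative only).
Q = rng.standard_normal((n, d))
K = rng.standard_normal((n, d))

# Softmax attention kernel: asymmetric because queries and keys differ.
scores = Q @ K.T / np.sqrt(d)
A = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
assert not np.allclose(A, A.T)  # symmetric eigendecomposition unsuitable

# SVD handles the asymmetric kernel: A = U S V^T, with paired
# left/right singular vectors; truncating to a few pairs gives a
# low-rank surrogate, the kind of reduction KSVD enables in the paper.
U, S, Vt = np.linalg.svd(A)
A_lowrank = U[:, :rank] * S[:rank] @ Vt[:rank]

err = np.linalg.norm(A - A_lowrank) / np.linalg.norm(A)
print(f"rank-{rank} relative reconstruction error: {err:.3f}")
```

KEP-SVGP then places a sparse variational Gaussian process over these singular pairs to attach calibrated uncertainty to the attention outputs; that part is beyond this sketch.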