Home

WHO Suicide Prevention Recommendations for Media

Preface: Preventing suicide: a resource for media professionals, update 2023 (1st ed.). World Health Organization, 2023. (AI-assisted summary.) Common myths and facts about suicide. Category | Myth | Fact | Explanation. Talking about suicide: Talking about suicide is a bad idea and ...

Read more

Xiaohongshu: Distribution-Aware Video Watch-Time Prediction (Best Paper)

Source: Multi-Granularity Distribution Modeling for Video Watch Time Prediction via Exponential-Gaussian Mixture Network. Code: https://github.com/BestActionNow/EGMN Introduction. Motivation: why predict the watch time of videos? Short-video platforms such as TikTok and KuaiShou have experienced a significant surge in popularity over recent y...

Read more

The Power of Peer Support: A Two-and-a-Half-Hour Chat

Today is October 13, 2025, 591 days since my first day as a PhD student. The good news: a close friend got admitted after a year-and-a-half gap, my senior labmate landed a Young Scientists Fund grant, and my junior labmate has nearly finished collecting his data. Rewind to yesterday: the sea breeze at the music festival was a little too free, so I came back to Nanjing with a cold and spent the day in a fog, not crawling out of bed until three to relocate to my desk and keep vegetating. Just as I reached the Jiuqu Bridge, I saw someone cycling toward me who looked vaguely familiar. Oh, it was the guy from dinner last time. I greeted him properly, expecting he would ride on after the hello, but instead he said, "Why do you look so worn out?", and just like that we started chatting. He stayed perched on his shared bike, I leaned back against the bridge's glass railing, and I figured that since the headache ruled out work anyway, I might as well chat at leisure. We started with work habits; being the kind soul I am, I shared my five-hour workday and my methods for training focus. The conversation gradually opened up, and we moved on to family relationships, ...

Read more

NLP-ERNIE 2.0

Source: [1907.12412] ERNIE 2.0: A Continual Pre-training Framework for Language Understanding. Code: https://github.com/PaddlePaddle/ERNIE Introduction. Motivation: Pre-training generally trains models on the co-occurrence of words and sentences. In fact, there is other lexical, syntactic and semantic infor...

Read more

NLP-Contrastive Learning-SimCSE

Source: SimCSE: Simple Contrastive Learning of Sentence Embeddings (EMNLP 2021). Code: GitHub - princeton-nlp/SimCSE. Paper: https://arxiv.org/abs/2104.08821 Introduction: Our unsupervised SimCSE simply predicts the input sentence itself, with only dropout (Srivastava et al., 2014) used as noise (Figur...

Read more

NLP-RoBERTa

Paper: [1907.11692] RoBERTa: A Robustly Optimized BERT Pretraining Approach. Code: GitHub - facebookresearch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python. Introduction. Motivation: Training is computationally expensive, often done on private datasets of different sizes, and, as we will show, hyperparameter choices have ...

Read more

NLP-ERNIE

Source: [1904.09223] ERNIE: Enhanced Representation through Knowledge Integration. Code: GitHub - PaddlePaddle/ERNIE: The official repository for ERNIE 4.5 and ERNIEKit, its industrial-grade development toolkit based on PaddlePaddle. Introduction. Motivation: These works do not consider the prior knowledge in the sentence. For example, in t...

Read more

NLP-ALBERT

Paper: [1909.11942] ALBERT: A Lite BERT for Self-supervised Learning of Language Representations. Code: https://github.com/google-research/ALBERT Introduction. Motivation: to address GPU/TPU memory limitations and longer training times, this study presents two parameter-reduction techniques to lower memory consumption and increase the training speed of BERT...

Read more