[Original] [Paper Reading] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
1 Introduction. BERT is a language representation model; the name stands for Bidirectional Encoder Representations from Transformers. BERT is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right co...
2023-05-07 22:15:31 279 1
[Original] [LeetCode] Longest Substring Without Repeating Characters
class Solution:
    def lengthOfLongestSubstring(self, s: str) -> int:
        length = len(s)
        i = 0
        temp = 0  # longest repeat-free run seen so far
        while i < length:
            slist = []
            # extend from position i until a character repeats
            for each in range(i, length):
                if s[each] in slist:  # the snippet was truncated here; the rest is reconstructed
                    break
                slist.append(s[each])
            temp = max(temp, len(slist))
            i += 1
        return temp
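The brute-force scan above restarts from every index, which is O(n²). A common one-pass alternative is a sliding window that tracks each character's last position; this is a sketch, not the post's own solution, and the class name `SlidingWindowSolution` is mine:

```python
class SlidingWindowSolution:
    def lengthOfLongestSubstring(self, s: str) -> int:
        last_seen = {}  # character -> index of its most recent occurrence
        best = 0
        start = 0  # left edge of the current repeat-free window
        for i, ch in enumerate(s):
            if ch in last_seen and last_seen[ch] >= start:
                # repeat inside the window: move the left edge past the old copy
                start = last_seen[ch] + 1
            last_seen[ch] = i
            best = max(best, i - start + 1)
        return best
```

Each index enters and leaves the window at most once, so the whole pass is O(n) with O(min(n, alphabet)) extra space.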
2022-04-21 19:50:43 83
[Original] [LeetCode] Add Two Numbers
First brute-force attempt:
# Definition for singly-linked list.
# class ListNode:
#     def __init__(self, val=0, next=None):
#         self.val = val
#         self.next = next
import copy
class Solution:
    def addTwoNumbers(self, l1: ListNode, l2: ListNode) -> ...
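The snippet above is cut off before the method body, so the post's actual approach is not recoverable here. For reference, the standard digit-by-digit solution with a carry looks like this; it is a sketch (class name `AddTwoNumbersSketch` is mine), not the author's brute-force version:

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

class AddTwoNumbersSketch:
    def addTwoNumbers(self, l1, l2):
        dummy = ListNode()  # placeholder head so we can always append to tail.next
        tail = dummy
        carry = 0
        while l1 or l2 or carry:
            total = (l1.val if l1 else 0) + (l2.val if l2 else 0) + carry
            carry, digit = divmod(total, 10)
            tail.next = ListNode(digit)
            tail = tail.next
            l1 = l1.next if l1 else None
            l2 = l2.next if l2 else None
        return dummy.next
```

Since the lists store digits in reverse order, adding 2→4→3 (342) and 5→6→4 (465) yields 7→0→8 (807), and the final carry check handles sums that gain a digit.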
2022-04-08 14:16:05 708