I have 2 .txt files.
The first .txt file is curl data (from a robot); it always has 2000 lines, including the new ones.
The second .txt file should contain only the new lines from the first .txt file.
I use the second .txt file as input for a script.
I can't get rid of the duplicates (I mean: I try to derive the new lines by comparing against the old ones), so the script always ends up processing both new and old data.
Is there a way to open both files, remove the duplicates, and save only the new lines to the second file?
Here are three examples.

Here is the FIRST refresh and the 2 .txt files:
first.txt file (assume it has 2000 lines) after the curl robot refresh:
Something here10
Something here9
Something here8
Something here7
Something here6
Something here5
Something here4
Something here3
Something here2
Something here1
The second .txt file I will use:
Something here10
Something here9
Something here8
Something here7
Something here6
Something here5
Something here4
Something here3
Something here2
Something here1

Here is the SECOND refresh and the 2 .txt files:
first.txt file (assume it has 2000 lines) after the curl bot refresh:
Something here14
Something here13
Something here12
Something here11
Something here10
Something here9
Something here8
Something here7
Something here6
Something here5
The second .txt file I will use:
Something here14
Something here13
Something here12
Something here11

Here is the THIRD refresh and the 2 .txt files:
first.txt file (assume it has 2000 lines) after the curl bot refresh:
Something here16
Something here15
Something here14
Something here13
Something here12
Something here11
Something here10
Something here9
Something here8
Something here7
The second .txt file I will use:
Something here16
Something here15
Edit:
I have posted two new updates.

Here is the FOURTH refresh and the 2 .txt files:
first.txt file (assume it has 2000 lines) after the curl bot refresh:
Something here20
Something here19
Something here18
Something here17
Something here16
Something here15
Something here14
Something here13
Something here12
Something here11
The second .txt file I will use:
Something here20
Something here19
Something here18
Something here17

Here is the FIFTH refresh and the 2 .txt files:
first.txt file (assume it has 2000 lines) after the curl bot refresh:
Something here24
Something here23
Something here22
Something here21
Something here20
Something here19
Something here18
Something here17
Something here16
Something here15
The second .txt file I will use:
Something here24
Something here23
Something here22
Something here21
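One way the dedup described above could be sketched in shell: keep a copy of the previous refresh, and write to the second file only the lines of the fresh curl output that are not in that copy. The file names `prev.txt` and the sample data below are assumptions for illustration; they are not from the question.

```shell
# Assumption: first.txt is the fresh curl output, prev.txt is the
# snapshot saved from the previous refresh (hypothetical names).
printf 'Something here14\nSomething here13\nSomething here12\nSomething here11\nSomething here10\n' > first.txt
printf 'Something here10\nSomething here9\n' > prev.txt

# -F: fixed strings, -x: whole-line match, -v: invert, -f: patterns from file.
# Keeps only the lines of first.txt that do not appear in prev.txt.
grep -Fxvf prev.txt first.txt > second.txt

# Save the current snapshot so the next refresh diffs against it.
cp first.txt prev.txt

cat second.txt   # only the four new lines remain
```

With the SECOND-refresh data above, `second.txt` would contain `Something here14` through `Something here11`, matching the expected second file. Note that on the very first run `prev.txt` must exist (it can be empty), and this approach compares whole lines only.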