This article describes how a Python script can concatenate all the files in a directory into a single file; it may be a useful reference for anyone solving the same problem.
Problem description
I have written the following script to concatenate all the files in the directory into one single file.
Can this be optimized, in terms of
idiomatic Python
time
Here is the snippet:
import time, glob

outfilename = 'all_' + str((int(time.time()))) + ".txt"
filenames = glob.glob('*.txt')

with open(outfilename, 'wb') as outfile:
    for fname in filenames:
        with open(fname, 'r') as readfile:
            infile = readfile.read()
            for line in infile:
                outfile.write(line)
            outfile.write("\n")
Solution
Use shutil.copyfileobj to copy the data:
import glob
import shutil

# outfilename is defined as in the question above
with open(outfilename, 'wb') as outfile:
    for filename in glob.glob('*.txt'):
        if filename == outfilename:
            # don't want to copy the output into the output
            continue
        with open(filename, 'rb') as readfile:
            shutil.copyfileobj(readfile, outfile)
shutil reads from the readfile object in chunks, writing them to the outfile file object directly. Do not use readline() or an iteration buffer, since you do not need the overhead of finding line endings.
Use the same mode for both reading and writing; this is especially important when using Python 3; I've used binary mode for both here.
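To tie the question and the answer together, here is a self-contained sketch of the whole script. It reuses the timestamped output name from the question and opens every file in binary mode, as the answer recommends; the explicit chunk size passed as the optional third argument to shutil.copyfileobj is an extra tweak for illustration, not part of the original answer.

import glob
import shutil
import time

# Timestamped output name, as in the original question.
outfilename = 'all_' + str(int(time.time())) + ".txt"

with open(outfilename, 'wb') as outfile:
    for filename in glob.glob('*.txt'):
        if filename == outfilename:
            # Don't copy the output file into itself.
            continue
        with open(filename, 'rb') as readfile:
            # copyfileobj streams the data in chunks; the optional third
            # argument overrides the default buffer size.
            shutil.copyfileobj(readfile, outfile, 1024 * 1024)

If a reproducible output order matters, wrap the pattern in sorted(glob.glob('*.txt')), since the order in which glob returns files depends on the file system.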
This concludes the article on using a Python script to concatenate all the files in a directory into one file; we hope the answer above proves helpful.