Problem description
I'm migrating from MATLAB to IPython, and before taking the leap I'm going through my minimal workflow to make sure every operation I perform daily in MATLAB for data crunching is available in IPython.
I'm currently stuck on the very basic task of saving and loading numpy arrays via a one-line command, such as MATLAB's:
>> save('myresults.mat','a','b','c')
>> load('myresults.mat')
In particular, what I like about MATLAB's load command is that not only does it read the data file, it also loads the variables into the workspace; nothing else is needed to start working with them. Note that this is not the case with, for instance, numpy.load(), which requires another line to assign the loaded values to workspace variables. [See: IPython: how to automagically load npz file and assign values to variables?]
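For context, here is a minimal sketch of the numpy side of this (the arrays a, b, c are placeholders): saving is a one-liner with np.savez, but np.load only returns an archive object that still has to be unpacked by hand:

    import numpy as np

    a, b, c = np.arange(3), np.ones(4), np.zeros(5)   # placeholder arrays

    # Saving several arrays is a one-liner, analogous to MATLAB's save:
    np.savez('myresults.npz', a=a, b=b, c=c)

    # Loading returns an NpzFile; the arrays are not injected into the
    # workspace, so a second step is needed to bind them to names:
    data = np.load('myresults.npz')
    a, b, c = data['a'], data['b'], data['c']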
Based on the answers and comments to that question, I came up with this dirty-bad-engineering-ugly-coding-but-working solution. I know it's not pretty, and I would like to know if you can come up with the correct version of this [1].
I put this into iocustom.py:
from IPython import get_ipython

def load(filename):
    ip = get_ipython()
    ip.ex("import numpy as np")
    ip.ex("locals().update(np.load('" + filename + "'))")
so that I can run, from the IPython session:
from iocustom import load
load('myresults.npz')
and the variables are dumped to the workspace.
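For reference, the same injection can be done without assembling source strings, by pushing the loaded arrays directly into the interactive namespace. This is only a sketch, assuming IPython's InteractiveShell.push API and a live shell returned by get_ipython():

    # iocustom.py -- a variant sketch, not the helper above
    import numpy as np
    from IPython import get_ipython

    def load(filename):
        """Load an .npz file and push its arrays into the IPython user namespace."""
        data = np.load(filename)
        # InteractiveShell.push() performs a dict-style update of the user namespace
        get_ipython().push({name: data[name] for name in data.files})

It is called exactly like the version above and leaves the arrays defined at the prompt.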
I find it hard to believe there is no built-in equivalent to this, and it's even harder to believe that this 3-line function is the optimal solution. I would be very grateful if you could suggest a more correct way of doing this.
Please keep in mind that:
- I'm looking for a solution which would also work inside a script and a function.
- I know there's "pickle" but I refuse to use more than one line of code for something as mundane as a simple 'save' and/or 'load' command.
- I know there's "savemat" and "loadmat" available from scipy, but I would like to migrate completely, i.e., work with numpy arrays rather than .mat files (see the sketch after this list).
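For completeness, a minimal sketch of that scipy.io route (again with placeholder arrays); note that loadmat likewise returns a dict rather than populating the workspace:

    import numpy as np
    from scipy.io import savemat, loadmat

    a, b, c = np.arange(3), np.ones(4), np.zeros(5)   # placeholder arrays

    savemat('myresults.mat', {'a': a, 'b': b, 'c': c})   # one line to save

    data = loadmat('myresults.mat')     # dict with 'a', 'b', 'c' plus metadata keys
    a, b, c = data['a'], data['b'], data['c']            # unpacking is still manual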
Thanks in advance for all your help.
[1] BTW: how do people working with IPython save and load a set of numpy arrays easily? After hours of googling I cannot seem to find a simple and straightforward solution for this daily task.
If I save this as load_on_run.py:
import argparse
import numpy as np

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('-l', '--list', help='list variables', action='store_true')
    parser.add_argument('filename')
    __args = parser.parse_args()
    data = np.load(__args.filename)
    locals().update(data)
    del parser, data, argparse, np
    if __args.list:
        print([k for k in locals() if not k.startswith('__')])
    del __args
And then in IPython I can invoke it with %run:
In [384]: %run load_on_run testarrays.npz -l
['array2', 'array3', 'array4', 'array1']
In [385]: array3
Out[385]: array([-10, -9, -8, -7, -6, -5, -4, -3, -2, -1])
It neatly loads the arrays from the file into the IPython workspace.
I'm taking advantage of the fact that the %run magic runs a script, leaving all functions and variables defined by it in the main namespace. I haven't looked into how it does this.
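If the same loading has to be triggered from inside a function or another script rather than typed at the prompt, the magic can also be invoked programmatically. A sketch, assuming an active IPython session; the wrapper name load_npz is made up, while run_line_magic is the InteractiveShell method behind the % syntax:

    from IPython import get_ipython

    def load_npz(filename, list_vars=False):
        """Hypothetical wrapper: run load_on_run.py via the %run magic."""
        line = 'load_on_run.py ' + filename + (' -l' if list_vars else '')
        get_ipython().run_line_magic('run', line)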
The script just takes a few arguments, loads the file (so far only .npz), and uses the locals().update trick to put its variables into the local namespace. Then I clear out the unnecessary variables and modules, leaving only the newly loaded ones.
I could probably define an alias for %run load_on_run.
I can also imagine a script along these lines that lets you load variables with an import: from <script> import *.
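A sketch of that import-based idea (the module name loadnpz_module and the NPZ_FILE environment variable are made up for illustration): the module loads the file at import time and re-exports the arrays, so a star import pulls them into the importing namespace:

    # loadnpz_module.py -- hypothetical module name
    import os
    import numpy as np

    # The filename must be known at import time; here it comes from an
    # environment variable, which is an arbitrary choice for this sketch.
    _data = np.load(os.environ.get('NPZ_FILE', 'testarrays.npz'))

    # Bind each array as a module-level name and limit * imports to them.
    globals().update({name: _data[name] for name in _data.files})
    __all__ = list(_data.files)

After setting NPZ_FILE, from loadnpz_module import * makes array1, array2, ... available in the importing module or session.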