Problem Description
I am implementing Kosaraju's Strongly Connected Components (SCC) graph algorithm in Python.
The program runs fine on small data sets, but when I run it on a very large graph (more than 800,000 nodes), it crashes with "Segmentation Fault".
What might be the cause of this? Thank you!
Additional info: I first got this error when running on the very large data set:
"RuntimeError: maximum recursion depth exceeded in cmp"
Then I raised the recursion limit with
sys.setrecursionlimit(50000)
but then got the "Segmentation Fault".
Believe me, it's not an infinite loop; it runs correctly on relatively small data. Is it possible the program exhausted some resource?
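For context on why raising the recursion limit alone can turn a clean RuntimeError into a segfault: sys.setrecursionlimit only lifts the interpreter's own guard, while each recursive call still consumes the fixed-size C stack, which overflows without a Python-level error. One common workaround is to run the recursive code in a thread given a larger stack via threading.stack_size. This is a minimal sketch (the recursion depth, stack size, and deep_count helper are illustrative assumptions, not from the original post):

```python
import sys
import threading

def deep_count(n):
    # Stand-in for a deeply recursive DFS (e.g. Kosaraju's first pass).
    if n == 0:
        return 0
    return 1 + deep_count(n - 1)

result = []

def worker():
    # Raising only the Python recursion limit lets recursion run past the
    # point where the default C stack overflows -- that overflow is what
    # surfaces as a segfault. This thread gets a larger stack to match.
    sys.setrecursionlimit(1_000_000)
    result.append(deep_count(30_000))

threading.stack_size(128 * 1024 * 1024)  # request a 128 MB stack (platform-dependent)
t = threading.Thread(target=worker)
t.start()
t.join()
print(result[0])  # 30000
```

A longer-term fix for graphs of this size is to rewrite the DFS iteratively with an explicit stack, which removes the recursion-depth problem entirely.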
Recommended Answer
This happens when a Python extension (written in C) tries to access memory beyond reach.
You can trace it in the following ways:
- Add sys.settrace at the very first line of the code.
- Use gdb as described by Mark in this answer. At the command prompt:
gdb python
(gdb) run /path/to/script.py
## wait for segfault ##
(gdb) backtrace
## stack trace of the C code
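If installing gdb is not an option, the standard-library faulthandler module (Python 3.3+) offers a lighter-weight alternative: it dumps the Python-level traceback to stderr when the process receives a fatal signal such as SIGSEGV. A minimal sketch:

```python
import faulthandler
import sys

# Install handlers for SIGSEGV, SIGFPE, SIGABRT, SIGBUS and SIGILL so the
# Python-level traceback is written to stderr if the process crashes.
faulthandler.enable(file=sys.stderr)

print(faulthandler.is_enabled())  # True
```

The same effect is available without touching the code by running the script with `python -X faulthandler script.py` or setting the PYTHONFAULTHANDLER environment variable.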