
MySQL load data infile - acceleration?


Problem description


Sometimes I have to re-import data for a project, reading about 3.6 million rows into a MySQL table (currently InnoDB, but I am not really limited to this engine). LOAD DATA INFILE ... has proved to be the fastest solution, but it has a trade-off: when importing without keys, the import itself takes about 45 seconds, but the subsequent key creation takes ages (it has already been running for 20 minutes); importing with keys on the table makes the import much slower.

There are keys on 3 fields of the table, referencing numeric fields. Is there any way to accelerate this?

Another issue: when I terminate the process that started a slow query, the query continues running on the database. Is there any way to terminate the query without restarting mysqld?

Thanks a lot.

Recommended answer

If you're using InnoDB and bulk loading, here are a few tips:

Sort your CSV file into the primary-key order of the target table: remember that InnoDB uses a clustered primary key, so the file will load faster if it is already sorted!
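As an illustration (the file names and sample data are hypothetical, not from the original answer), a numeric key in the first CSV column can be pre-sorted with the standard `sort` utility:

```shell
# Hypothetical, out-of-order CSV with the primary key in column 1.
printf '3,cherry\n1,apple\n2,banana\n' > data.csv

# Sort numerically (-n) on the first comma-separated field (-t, -k1,1)
# so the rows match the target table's clustered primary-key order.
sort -t, -k1,1n data.csv > data_sorted.csv

cat data_sorted.csv
```

With the rows arriving in clustered-index order, InnoDB appends to the end of the index instead of inserting into the middle of the B-tree.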

The typical LOAD DATA INFILE sequence I use:

truncate <table>;
set autocommit = 0;
load data infile <path> into table <table>...
commit;

Other optimisations you can use to boost load times:

set unique_checks = 0;
set foreign_key_checks = 0;
set sql_log_bin = 0;
                  
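A hedged aside not spelled out in the original answer: these are session flags that disable uniqueness/foreign-key checking and binary logging, so they are normally switched back on once the load has committed. A sketch of the full session (placeholders as above):

```sql
set unique_checks = 0;
set foreign_key_checks = 0;
set sql_log_bin = 0;

load data infile <path> into table <table>...;
commit;

-- restore the defaults once the load has committed
set unique_checks = 1;
set foreign_key_checks = 1;
set sql_log_bin = 1;
```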

Split the CSV file into smaller chunks.
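A sketch of the chunking step with the standard `split` utility (the file names and the 4-line chunk size are illustrative; something like `-l 1000000` would be realistic), after which each chunk gets its own LOAD DATA INFILE ... COMMIT:

```shell
# Stand-in for the real export: 10 CSV rows.
seq 1 10 | sed 's/$/,row/' > big.csv

# Split into 4-line chunks named chunk_aa, chunk_ab, chunk_ac.
split -l 4 big.csv chunk_

wc -l chunk_*
```

Smaller chunks keep each transaction (and its undo log) bounded, and make a failed load cheap to retry.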

Typical import stats I have observed during bulk loads:

3.5 - 6.5 million rows imported per minute
210 - 400 million rows per hour
                  
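The answer above does not address the second question. As a hedged note (standard MySQL behaviour, not part of the original answer), a runaway statement can usually be stopped from a second client session without restarting mysqld:

```sql
show processlist;    -- find the Id of the long-running query
kill query <id>;     -- stop the statement but keep the connection
-- or: kill <id>;    -- drop the whole connection
```

This matches the behaviour described in the question: killing the client process does not stop the server-side query, but KILL QUERY does.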

