{"id":2955,"date":"2023-07-24T06:28:07","date_gmt":"2023-07-24T06:28:07","guid":{"rendered":"http:\/\/kedar.nitty-witty.com\/?p=2955"},"modified":"2026-01-16T11:12:58","modified_gmt":"2026-01-16T11:12:58","slug":"how-to-overcome-throttling-and-rate-exceeded-errors-in-downloaddblogfileportion","status":"publish","type":"post","link":"https:\/\/kedar.nitty-witty.com\/blog\/how-to-overcome-throttling-and-rate-exceeded-errors-in-downloaddblogfileportion","title":{"rendered":"How to overcome Throttling and Rate Exceeded Errors in DownloadDBLogFilePortion"},"content":{"rendered":"\n<p>I was attempting to download the MySQL slow query logs to perform a slow query review. In this blog we will explore the issue I faced while downloading the slow logs and the workaround I used to solve it. I would also love to hear about better approaches you take for similar operations.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Downloading Slow Logs for MySQL from AWS<\/h2>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>If you&#8217;re logging RDS slow logs to a MySQL table, read: <a href=\"http:\/\/kedar.nitty-witty.com\/blog\/mysql-slow-query-log-export-and-review-in-rds\" target=\"_blank\" rel=\"noopener\" title=\"\">Downloading slow logs from RDS<\/a><\/p>\n<\/blockquote>\n\n\n\n<p>For MySQL databases hosted on AWS RDS, the AWS CLI provides commands to download slow query logs for in-depth analysis. With simple commands such as &#8220;describe-db-log-files&#8221; and &#8220;download-db-log-file-portion,&#8221; administrators can efficiently retrieve and review slow logs, gaining valuable insights into database performance and query optimization. 
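<\/p>\n\n\n\n<p>Incidentally, the AWS CLI can filter the log-file list by itself with a JMESPath --query expression, so no shell text processing is needed. The sketch below is only illustrative; the list_slow_logs helper name is mine and the instance identifier is just an example:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># List slow log file names using the CLI's built-in --query (JMESPath) filter.\nlist_slow_logs() { aws rds describe-db-log-files --db-instance-identifier \"$1\" --query \"DescribeDBLogFiles[?contains(LogFileName,'slowquery')].LogFileName\" --output text; }\n\n# Example: list_slow_logs rds_slow_log_test<\/code><\/pre>\n\n\n\n<p>With that aside, here is what I actually ran. 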
<\/p>\n\n\n\n<p>I began by using the AWS CLI command to describe the database log files and extract the slow log file names:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>aws rds describe-db-log-files --db-instance-identifier rds_slow_log_test | grep 'FileName' | grep slow | awk -F'\"' '{print $4}'<\/code><\/pre>\n\n\n\n<p>I then extended this into a loop over the parsed list to download each file:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>for slowlog in $(aws rds describe-db-log-files --db-instance-identifier rds_slow_log_test | grep 'FileName' | grep slow | awk -F'\"' '{print $4}'); do echo \"Downloading $slowlog\"; aws rds download-db-log-file-portion --db-instance-identifier rds_slow_log_test --starting-token 0 --output text --log-file-name $slowlog &gt; $slowlog; done;<\/code><\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">Throttling error about Rate exceeded<\/h2>\n\n\n\n<p>However, while executing the download-db-log-file-portion command, I received the following error:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>An error occurred (Throttling) when calling the DownloadDBLogFilePortion operation (reached max retries: 2): Rate exceeded\nDownloading slowquery\/mysql-slowquery.log.2023-07-23.15<\/code><\/pre>\n\n\n\n<p>This resulted in incomplete downloads of the slow query logs:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>-rw-rw-r--. 1 centos centos 100M Jul 24 05:13 mysql-slowquery.log.2023-07-23.15\n-rw-rw-r--. 1 centos centos 30M  Jul 24 05:25 mysql-slowquery.log.2023-07-23.16\n-rw-rw-r--. 1 centos centos 150M Jul 24 05:38 mysql-slowquery.log.2023-07-23.17\n-rw-rw-r--. 1 centos centos 110M Jul 24 05:49 mysql-slowquery.log.2023-07-23.18\n-rw-rw-r--. 1 centos centos 210M Jul 24 05:58 mysql-slowquery.log.2023-07-23.19<\/code><\/pre>\n\n\n\n<p>The slow logs were cut short by the error mentioned above. 
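<\/p>\n\n\n\n<p>As a crude stop-gap, each failing download could also be wrapped in a small retry helper with exponential backoff. The sketch below is my own workaround idea, not an AWS tool; the retry_with_backoff and download_one names are made up:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># retry_with_backoff MAX_ATTEMPTS COMMAND [ARGS...]\n# Reruns COMMAND until it succeeds, doubling the pause between attempts.\nretry_with_backoff() {\n    local max_attempts=$1; shift\n    local attempt=1 delay=1\n    while true; do\n        if \"$@\"; then return 0; fi\n        if [ \"$attempt\" -ge \"$max_attempts\" ]; then return 1; fi\n        sleep \"$delay\"\n        delay=$((delay * 2))\n        attempt=$((attempt + 1))\n    done\n}\n\n# Download one log file; the redirect sits inside so every attempt starts fresh.\ndownload_one() { aws rds download-db-log-file-portion --db-instance-identifier rds_slow_log_test --starting-token 0 --output text --log-file-name \"$1\" &gt; \"$1\"; }\n\n# Hypothetical usage: retry_with_backoff 5 download_one \"$slowlog\"<\/code><\/pre>\n\n\n\n<p>Retrying a whole download is wasteful, though; letting the AWS CLI itself retry individual API calls is the cleaner fix. 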
In this <a href=\"POST link\" target=\"_blank\" rel=\"noopener nofollow\" title=\"\">re:Post article<\/a>, AWS discusses the rate exceeded exception in more detail.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Fixing the throttling, rate exceeded error<\/h2>\n\n\n\n<p>To overcome the Throttling error, AWS suggests implementing error retries and exponential backoff when making API calls. One of the key environment variables to configure is AWS_MAX_ATTEMPTS, which specifies the maximum number of attempts to make on a request. (The related AWS_RETRY_MODE variable selects the retry strategy itself; see the retry-behavior reference below.)<\/p>\n\n\n\n<p>The error output shows that the CLI gave up after only &#8220;2&#8221; retries, which can be tweaked. So, I reran the commands after exporting AWS_MAX_ATTEMPTS to a somewhat larger number:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>export AWS_MAX_ATTEMPTS=100;\nfor slowlog in $(aws rds describe-db-log-files --db-instance-identifier rds_slow_log_test | grep 'FileName' | grep slow | awk -F'\"' '{print $4}'); do echo \"Downloading $slowlog\"; aws rds download-db-log-file-portion --db-instance-identifier rds_slow_log_test --starting-token 0 --output text --log-file-name $slowlog &gt; $slowlog; done;<\/code><\/pre>\n\n\n\n<p>and the slow logs were fully downloaded:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>-rw-rw-r--. 1 centos centos 5.1G Jul 24 06:23 mysql-slowquery.log.2023-07-23.15\n-rw-rw-r--. 1 centos centos 5.5G Jul 24 06:45 mysql-slowquery.log.2023-07-23.16\n-rw-rw-r--. 1 centos centos 5.3G Jul 24 07:07 mysql-slowquery.log.2023-07-23.17\n-rw-rw-r--. 1 centos centos 5.1G Jul 24 07:28 mysql-slowquery.log.2023-07-23.18\n-rw-rw-r--. 
1 centos centos 5.2G Jul 24 07:49 mysql-slowquery.log.2023-07-23.19<\/code><\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">GitHub script to download MySQL slow logs<\/h2>\n\n\n\n<p>I also maintain an older script that uses the same download-db-log-file-portion command; it has since been updated to avoid the throttling and rate exceeded errors: <a href=\"https:\/\/github.com\/kedarvj\/AWS-scripts\/blob\/master\/get_rds_slow_log\" target=\"_blank\" rel=\"noreferrer noopener\">get_rds_slow_log<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Conclusion<\/h2>\n\n\n\n<p>The journey to download large slow query logs can be fraught with rate limit challenges. However, by raising AWS_MAX_ATTEMPTS as AWS recommends, I successfully overcame the Throttling and Rate Exceeded errors. This experience has proven valuable, and I hope this blog helps others facing a similar roadblock. Feel free to share your own strategies for tackling this issue and optimizing slow query log reviews.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">References<\/h2>\n\n\n\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/docs.aws.amazon.com\/sdkref\/latest\/guide\/feature-retry-behavior.html\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/docs.aws.amazon.com\/sdkref\/latest\/guide\/feature-retry-behavior.html<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/repost.aws\/knowledge-center\/ssm-parameter-store-rate-exceeded\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/repost.aws\/knowledge-center\/ssm-parameter-store-rate-exceeded<\/a><\/li>\n<\/ul>\n<\/div><\/div>\n<\/div><\/div>\n","protected":false},"excerpt":{"rendered":"I was attempting to download the MySQL slow query logs to perform a slow 
query review. In this blog we will explore the issue I faced while downloading the slow&hellip;\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[868,8,377],"tags":[629,634,637,630,635,632,636,631],"class_list":{"0":"post-2955","1":"post","2":"type-post","3":"status-publish","4":"format-standard","6":"category-aws-rds","7":"category-mysql","8":"category-mysql-articles","9":"tag-aws","10":"tag-aws-cli","11":"tag-download-slow-logs","12":"tag-downloaddblogfileportion","13":"tag-query-optimization","14":"tag-rate-exceeded","15":"tag-slow-query-logs","16":"tag-throttling"},"aioseo_notices":[],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/posts\/2955","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/comments?post=2955"}],"version-history":[{"count":7,"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/posts\/2955\/revisions"}],"predecessor-version":[{"id":3541,"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/posts\/2955\/revisions\/3541"}],"wp:attachment":[{"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/media?parent=2955"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/categories?post=2955"},{"taxonomy":"post_tag","embeddable":true,
"href":"https:\/\/kedar.nitty-witty.com\/blog\/wp-json\/wp\/v2\/tags?post=2955"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}