Replies: 8 comments 37 replies
-
Hi, how does it compare to the situation you had before?
Have you tried lower values? Have you tried with another CI, or building locally? 360 docs is already a large site, and every time you cut a new version you increase the build time. Unfortunately, we can't do much to avoid the build time increasing in such a case. If this becomes a problem, you can take older versions and publish them as standalone sites. At some point, if your site has many docs, many versions, and many languages, then yes, you will have large build times unless you split the site into smaller subsites.
-
@slorber I've gotten the go-ahead to provide you with the sources, as long as I can send them to you personally. How should I send them to you?
-
just try
which reduced my build time from 30+ minutes to 3 minutes.
-
I think we encountered the same problem. We recently moved to Docusaurus for Puppeteer, and we are trying to generate a versioned documentation website. At first, the performance seemed good enough, but as soon as we started publishing more versions, the process started to take more memory and time, to the point where building the site on GitHub Actions fails with out-of-memory errors or takes many hours (I'm not sure if it's hanging or actually still building). I wonder if there are workarounds for this problem? The source of the site is here: https://github.com/puppeteer/puppeteer/tree/main/website
-
I think we have a similar issue. We are trying to use Docusaurus to generate an internal site with both hand-written documentation files and auto-generated Markdown files for the API reference (something like the MSFT docs site). We have ~3100 Markdown files in total, most of which are the auto-generated API reference docs. With this configuration, the production build is very slow (20+ minutes) and often runs out of memory.
-
@slorber / @RDIL any thoughts on next steps here?
-
This morning on my CI/CD.
Is there a way to get more details on the steps executed during the build and how long each one takes?
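As far as I know, Docusaurus doesn't print a per-step timing breakdown, but a rough sketch of what you can get from the outside (assuming a standard `npm run build` script; both commands below are assumptions about your setup, not confirmed from it):

```shell
# Coarse overall timing of the build.
time npm run build

# V8's GC tracing can show where memory pressure builds up during the run;
# --trace-gc is a standard Node/V8 flag.
NODE_OPTIONS="--trace-gc" npm run build > build.log 2>&1
```

The GC log won't name Docusaurus build phases, but a sudden burst of full GCs partway through the log is usually a good hint of which phase is the memory hog.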
-
Are there more ways to speed up the build, e.g. building multiple locales in parallel?
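One hedged sketch of parallel locale builds: Docusaurus v2 supports building a single locale via `--locale`, so separate processes can run concurrently. The locale names (`en`, `fr`) and output layout below are assumptions; also note the processes may contend over the shared `.docusaurus` cache directory:

```shell
# Build each locale in its own process, into its own output directory,
# then wait for both background jobs to finish.
npx docusaurus build --locale en --out-dir build/en &
npx docusaurus build --locale fr --out-dir build/fr &
wait
```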
-
So I just finished deploying a new version of our docs (https://support.touchgfx.com/) which introduces versioning. It works well, but I noticed some concerning numbers for build times and memory usage. This particular version uses Docusaurus alpha.54, so if these issues have been alleviated in newer versions, I apologise (we will upgrade to the newest Docusaurus version before our next release).

First of all, we have to pass the `--max_old_space_size=16000` option to the node command to get the build to finish, otherwise it runs out of memory. It also took about 26 minutes to build the documentation, as can be seen in the following image. Furthermore, I saw upwards of 10 GB of RAM being used at some points during the build.

For reference, this is building two versions of our docs (each with about 360 .mdx files) plus the `next` version, which we don't use (it seems I can disable building a `next` version in a newer Docusaurus release, which is nice).

So it seems that adding versions is really hard on build times and resources, so I thought I'd bring some awareness to this issue if you hadn't already considered it 😄
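For anyone hitting the same out-of-memory failure, a minimal sketch of raising Node's heap limit without editing the build script itself (the 16000 MB value mirrors the one above; adjust to your machine):

```shell
# NODE_OPTIONS is read by the node process that npm spawns, so the flag
# reaches the Docusaurus build without changes to package.json.
# Node/V8 accept both --max_old_space_size and --max-old-space-size spellings.
NODE_OPTIONS="--max_old_space_size=16000" npm run build
```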
Really appreciate the great work you guys are doing!