Hi,

Currently, compiling the website needs more than 8 GB of memory, and it is
time-consuming...

We can expect the time cost to keep growing as we release more versions,
because we need to compile the docs of every branch from 0.8, 0.9, 0.10,
etc., up to master (we forgot to rename that branch...).

However, many docs, e.g., the markdown files on the 0.8, 0.9 and 0.10
branches, are rarely modified. That is to say, we do not need to compile
them every time.

Therefore, I'd like to propose a reorganization of how we compile the
website.

1. modify site/pom.xml, putting the doc compilation of each version into a
different Maven profile.
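
As a rough sketch, the profiles could look like the fragment below. The
profile ids and the `doc.version` property are my own assumptions for
illustration, not the actual content of site/pom.xml:

```xml
<!-- Sketch only: profile ids and the doc.version property are assumptions. -->
<profiles>
  <profile>
    <id>compile-site-0.10</id>
    <properties>
      <doc.version>0.10</doc.version>
    </properties>
  </profile>
  <profile>
    <id>compile-site-master</id>
    <properties>
      <doc.version>master</doc.version>
    </properties>
  </profile>
</profiles>
```

Then `mvn package -P compile-site-0.10` would build only that one version.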

2. in each profile, compile only the docs of one version:

   - download the docs (a zip file) from GitHub;
   - unpack the zip file and copy its contents into the target/.... folders;
   - compile;
   - copy the generated HTML files to the final output directory.
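
These steps could be sketched in shell roughly as follows. All paths are
hypothetical, and instead of downloading a real zip from GitHub, a stand-in
zip is built locally so the sketch is self-contained:

```shell
# Sketch of step 2 for one version. All paths here are assumptions.
VERSION=0.10

# In the real flow the zip would be downloaded from GitHub (e.g. with curl);
# here we build a stand-in zip locally for illustration.
mkdir -p docs-src
echo "# User Guide" > docs-src/UserGuide.md
python3 -m zipfile -c "docs-${VERSION}.zip" docs-src/

# unpack, then copy into the working folder used by the site build
python3 -m zipfile -e "docs-${VERSION}.zip" tmp-docs/
mkdir -p "target/vue-source/src/${VERSION}"
cp -r tmp-docs/docs-src/. "target/vue-source/src/${VERSION}/"
```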

3.1 if we use the `scm-publish` plugin and only compile the docs of some
versions (i.e., enable only some profiles), we can download the current
HTML files from the `iotdb-website` repo, merge them with the newly
generated files, and then call `scm-publish` to overwrite all the files in
the website repo;
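
For reference, a minimal `scm-publish` configuration might look like the
fragment below; the URL, branch, and directories are assumptions:

```xml
<!-- Sketch only: pubScmUrl, branch, and directories are assumptions. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-scm-publish-plugin</artifactId>
  <version>3.0.0</version>
  <configuration>
    <content>${project.build.directory}/site</content>
    <pubScmUrl>scm:git:https://github.com/apache/iotdb-website.git</pubScmUrl>
    <scmBranch>master</scmBranch>
  </configuration>
</plugin>
```

With that in place, `mvn scm-publish:publish-scm` would push the merged
HTML to the website repo.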

3.2 if we use the `git push` command to upload the new files to the
website repo, it is easier, but it may leave some useless files behind in
the website repo.
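
Option 3.2 can be sketched as below; a local bare repository stands in for
the real `iotdb-website` repo, and the path of the generated HTML is
hypothetical:

```shell
# A local bare repo stands in for the real iotdb-website repo (assumption).
git init -q --bare website-remote.git
git clone -q website-remote.git website

# Pretend these are the freshly generated HTML files (hypothetical path).
mkdir -p site/target/site
echo "<html>updated</html>" > site/target/site/index.html

# Copy the generated files over the checkout, commit, and push.
cp -r site/target/site/. website/
git -C website add -A
git -C website -c user.name=ci -c user.email=ci@example.com \
    commit -qm "Update website"
git -C website push -q origin HEAD
```

Note that files deleted from the generated output are not removed from the
checkout by `cp`, which is exactly the "useless files left behind" problem
mentioned above.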


I think this approach can work, and it will solve the problem that
compiling the website takes too much memory.


Then we can use a GitHub Action to automatically compile and update the
website (Jenkins would also work).
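
A minimal workflow could look like this; the trigger, Java version, and
profile name are all assumptions for illustration:

```yaml
# Sketch only: trigger, Java version, and profile name are assumptions.
name: Update website
on:
  push:
    branches: [master]
jobs:
  build-site:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v2
        with:
          distribution: temurin
          java-version: '8'
      - name: Compile only the docs of the changed version
        run: mvn -B package -pl site -P compile-site-master
```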


I have created an issue on Jira [1] and am calling for contributors. (It
requires knowing how to write a pom file, and only a very little knowledge
of npm.)

[1] https://issues.apache.org/jira/browse/IOTDB-1009

Best,
-----------------------------------
Xiangdong Huang
School of Software, Tsinghua University

 黄向东
清华大学 软件学院
