sergehuber commented on code in PR #647:
URL: https://github.com/apache/unomi/pull/647#discussion_r1312702554

##########
manual/src/main/asciidoc/migrations/migrate-1.6-to-2.0.adoc:
##########
@@ -22,58 +22,58 @@ There are two main steps in preparing your migration to Apache Unomi 2.0:
 
 === Updating applications consuming Unomi
 
-Since Apache Unomi is an engine, you've probably built multiple applications consuming its APIs, you might also have built extensions directly running in Unomi.
+Since Apache Unomi is an engine, you've probably built multiple applications consuming its APIs, you might also have built extensions directly running in Unomi.
 
-As you begin updating applications consuming Apache Unomi, it is generally a good practice to <<Enable debug mode>>.
+As you begin updating applications consuming Apache Unomi, it is generally a good practice to <<_enabling_debug_mode,enable debug mode>>.
 Doing so will display any errors when processing events (such as JSON Schema validations), and will provide useful indications towards solving issues.
 
 ==== Data Model changes
 
-There has been changes to Unomi Data model, please make sure to review those in the << what_s_new>> section of the user manual.
+There has been changes to Unomi Data model, please make sure to review those in the <<_whats_new_in_apache_unomi_2_0,What's new in Unomi 2>> section of the user manual.
 
 ==== Create JSON schemas
 
 Once you updated your applications to align with Unomi 2 data model, the next step will be to create the necessary JSON Schemas.
 
-Any event (and more generally, any object) received through Unomi public endpoints do require a valid JSON schema.
-Apache Unomi ships, out of the box, with all of the necessary JSON Schemas for its own operation but you will need to create schemas for any custom event you may be using.
+Any event (and more generally, any object) received through Unomi public endpoints do require a valid JSON schema.
+Apache Unomi ships, out of the box, with all of the necessary JSON Schemas for its own operation as well as all event types generated from the Apache Unomi Web Tracker but you will need to create schemas for any custom event you may be using.
 
-When creating your new schemas, reviewing debug messages in the logs (using: `log:set DEBUG org.apache.unomi.schema.impl.SchemaServiceImpl` in Karaf console),
-will point to errors in your schemas or will help you diagnose why the events are not being accepted.
+When creating your new schemas, you can setup debug messages in the logs (using: `log:set DEBUG org.apache.unomi.schema.impl.SchemaServiceImpl` in Karaf console) that
+will point to errors in your schemas or will help you diagnose why the events are not being accepted.
 It is also possible to use the UNOMI_LOGS_JSONSCHEMA_LEVEL environment variable (by setting it to the `DEBUG` value) and then restarting Apache Unomi to accomplish the same thing.
 The second option is especially useful when using Docker containers.
 It is also possible to test if your events are valid with the a new API endpoint mapped at `/cxs/jsonSchema/validateEvent`.
 
 Note that it is currently not possible to modify or surcharge an existing system-deployed JSON schema via the REST API. It is however possible to deploy new schemas and manage them through the REST API on the `/cxs/jsonSchema` endpoint.
 
-If you are currently using custom properties on an Apache Unomi-provided event type,
+If you are currently using custom properties on an Apache Unomi-provided event type,
 you will need to either change to use a new custom eventType and create the corresponding schema or to create a Unomi schema extension.
 You can find more details in the <<JSON schemas,JSON Schema>> section of this documentation.
 
-You can use, as a source of inspiration for creating new schemas, Apache Unomi 2.0 schema located at:
+You can use, as a source of inspiration for creating new schemas, Apache Unomi 2.0 schema located at:
 https://github.com/apache/unomi/tree/master/extensions/json-schema/services/src/main/resources/META-INF/cxs/schemas[extensions/json-schema/services/src/main/resources/META-INF/cxs/schemas].
 
-Finally, and although it is technically feasible, we recommend against creating permissive JSON Schemas allowing any event payload. This requires making sure that you don't allow open properties by using JSON schema keywords such as https://json-schema.org/understanding-json-schema/reference/object.html#unevaluated-properties[unevaluated properties]
+Finally, and although it is technically feasible, we recommend against creating permissive JSON Schemas allowing any event payload. This requires making sure that you don't allow undeclared properties by setting JSON schema keywords such as https://json-schema.org/understanding-json-schema/reference/object.html#unevaluated-properties[unevaluated properties]
 to `false`.
 
 === Migrating your existing data
 
 ==== Elasticsearch version and capacity
 
-While still using Unomi 1.6, the first step will be to upgrade your Elasticsearch to 7.17.5.
+While still using Unomi 1.6, the first step will be to upgrade your Elasticsearch to 7.17.5.
 Documentation is available on https://www.elastic.co/guide/en/elasticsearch/reference/7.17/setup-upgrade.html[Elasticsearch's website].
 
-Your Elasticsearch cluster must have enough capacity to handle the migration.
-At a minimum, the required capacity must be greater than the size of the dataset in production + the size of the largest index.
+Your Elasticsearch cluster must have enough capacity to handle the migration.
+At a minimum, the required capacity storage capacity must be greater than the size of the dataset in production + the size of the largest index and any other settings should at least be as big as the source setup (preferably higher).

Review Comment:
   I fixed this by breaking to another sentence.
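
As an illustration of the schema guidance quoted above, here is a rough sketch of how the `/cxs/jsonSchema` and `/cxs/jsonSchema/validateEvent` endpoints could be exercised. Everything in it is an assumption rather than something taken from the PR: the host and port, the default `karaf:karaf` credentials, the `myCustomEvent` event type, its properties, and the file name are placeholders, and the schema skeleton deliberately omits the Unomi-specific metadata that the built-in schemas linked in the diff carry.

```shell
# Illustrative sketch only (not part of the PR): host, port, credentials,
# file names and the "myCustomEvent" event type are assumptions.

# 1) A minimal, restrictive JSON schema skeleton for a hypothetical custom event.
#    Before deploying it, complete it with the Unomi-specific metadata used by the
#    built-in schemas linked in the diff above.
cat > my-custom-event-schema.json <<'EOF'
{
  "$schema": "https://json-schema.org/draft/2019-09/schema",
  "title": "myCustomEvent",
  "type": "object",
  "properties": {
    "myProperty": { "type": "string" }
  },
  "unevaluatedProperties": false
}
EOF

# 2) Deploy the schema through the private REST API (basic auth assumed to be the
#    Karaf defaults; change them for any real setup).
curl -X POST "http://localhost:8181/cxs/jsonSchema" \
  --user karaf:karaf \
  -H "Content-Type: application/json" \
  --data @my-custom-event-schema.json

# 3) Check whether a sample event payload would be accepted, using the validation
#    endpoint mentioned in the diff.
curl -X POST "http://localhost:8181/cxs/jsonSchema/validateEvent" \
  --user karaf:karaf \
  -H "Content-Type: application/json" \
  --data '{
    "eventType": "myCustomEvent",
    "scope": "my-site",
    "properties": {
      "myProperty": "some value"
    }
  }'
```

Combined with the DEBUG logging mentioned in the diff, this kind of validation call makes it fairly quick to iterate on a schema until custom events are accepted.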
