Data Population Placeholders
| File | Description |
|---|---|
| data-population.properties | Defines properties required by the Data Population tool for execution in the data population environment, for example, the Liquibase contexts to activate for this environment. |
| database.properties | Defines database connection properties and the data population environment name for this environment. |
| filtering.properties | Defines the values used when overriding property placeholders in data files for this environment. |
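As an illustration, the environment-specific properties directories might be laid out as follows (the environment names shown here are taken from the examples later in this section; your project may define different environments):

```
extensions/database/ext-data/src/main/resources/environments/
├── local/
│   ├── data-population.properties
│   ├── database.properties
│   └── filtering.properties
└── production/
    ├── data-population.properties
    ├── database.properties
    └── filtering.properties
```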
For the Data Population command line tool, the location of the environment-specific properties directory is specified as a command line parameter.
Data Filtering Property Placeholders
You can use Spring-style property placeholders, for example ${property.name} or ${property.name:defaultvalue}, in both Liquibase and Import/Export data. With these property placeholders, you can define your data once and deploy to multiple environments without using the Elastic Path Core Tool or environment-specific SQL scripts. The filtering.properties file in the environment directory stores values which replace the property placeholders.
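To illustrate (the property name, table, and column shown here are hypothetical), a Liquibase change set could reference a placeholder with a default value, and each environment's filtering.properties could supply its own replacement:

```
<!-- changelog.xml: ${smtp.host} is resolved from filtering.properties at data population time -->
<changeSet id="set-smtp-host" author="elasticpath">
  <insert tableName="NOTIFICATION_SETTINGS">
    <column name="SMTP_HOST" value="${smtp.host:localhost}"/>
  </insert>
</changeSet>
```

```
# environments/production/filtering.properties
smtp.host=smtp.example.com
```

An environment that defines no value for smtp.host falls back to the default after the colon, localhost.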
Properties in the .m2/settings.xml file in the Maven repository use the values defined in the filtering.properties file for each environment.
Conditional Execution of Data Changes
```
<!-- Processed for all environments -->
<changeSet id="import-prod-data" author="elasticpath">
  ...
</changeSet>

<!-- Processed only for environments with a context of "test-data" -->
<changeSet id="import-test-data" author="elasticpath" context="test-data">
  ...
</changeSet>
```

You can also use the context parameter to identify change sets as expand or contract, which communicates that the change set modifies the database schema in a specific way. In a zero-downtime deployment against an active database, you must not make destructive changes until the application has been fully deployed to all nodes, and the context parameter is used to ensure this. Such a deployment has two phases: an expand phase, which performs a rolling deployment of the application, and a contract phase, which removes the unused database tables or columns. Both phases depend on the context parameter but involve a single environment.
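The two phases could be sketched as follows (the context names, table, and column here are illustrative; Liquibase matches whichever contexts your environment activates):

```
<!-- Phase 1: additive change, safe to apply while old application nodes are still live -->
<changeSet id="add-customer-email-column" author="elasticpath" context="expand">
  <addColumn tableName="CUSTOMER">
    <column name="EMAIL" type="VARCHAR(255)"/>
  </addColumn>
</changeSet>

<!-- Phase 2: destructive change, run only after all nodes use the new schema -->
<changeSet id="drop-legacy-email-table" author="elasticpath" context="contract">
  <dropTable tableName="CUSTOMER_EMAIL_LEGACY"/>
</changeSet>
```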
Repeated Execution of Data Changes
Repeated execution of data changes is useful in a shared development environment in which the team uses a single data set, such as Import/Export data, over a period of time, but wants to avoid resetting the database each time a data change is made. You can append a qualifier placeholder to a change set ID so that the change set runs in one environment whenever a db-update is performed, but runs only once in other environments. For example, if a v1.0 catalog import needs to run on every db-update in development, but only once in production, append a qualifier to the change set ID.
```
<changeSet id="import-prod-v1.0-catalog${prod.v1.0.data.import.qualifier}" author="elasticpath">
  ...
</changeSet>
```
Repeated Execution of Data Changes in Local
In the filtering.properties file at extensions/database/ext-data/src/main/resources/environments/local, set the qualifier to the timestamp.qualifier placeholder to configure your local environment to run the change set on every db-update execution. Because timestamp.qualifier resolves to the current time, the generated change set ID is different on every run.

```
prod.v1.0.data.import.qualifier=${timestamp.qualifier}
```
Repeated Execution of Data Changes in Production
You can use a fixed-value qualifier in the filtering.properties file at extensions/database/ext-data/src/main/resources/environments/production to ensure that the change set runs only once.

```
# The fixed qualifier can be set to a specific version
prod.v1.0.data.import.qualifier=-v1.0
# or it can be empty
prod.v1.0.data.import.qualifier=
```