On April 2, I gave a webinar for MadCap called “Contextualizing Our Content – Today and Tomorrow.” In that webinar, I briefly reviewed how traditional context-sensitive help (CSH) and responsive design work before discussing and demoing advanced contextualization. That involved customizing a Flare target in real time, without regenerating the output, in response to simulated sensor data, such as a change in temperature reported by a temperature sensor. The idea was to show how content can be customized based on input from anywhere, not just from within the computer as in CSH and responsive design. The bigger idea was to illustrate an example of contextualization in support of the Information 4.0 concept.
I won’t describe that first webinar in any more detail. You can find it in the list of recorded webinars at https://www.madcapsoftware.com/resources/recorded-webinars.aspx#flare.
Several viewers asked what other criteria could be used to customize the content. Those questions led me to do some further testing to see what other properties of the output could be modified without the need to regenerate the output – customizing the output in real time by modifying a control file, saving the modification, and then simply refreshing the output. I didn’t test every possibility – this was a proof-of-concept test – but here’s what I found.
The input control data comes first. This could be external sensor data, as in the webinar, but, in theory, could be any kind of data from anywhere. One viewer asked if artificial intelligence data might work. I wasn’t able to test this but it should work. (If you’re the person who asked that question, please contact me if you’d like to discuss this further.)
The next issue, and the topic of this post, is what you can modify in the output. As the webinar showed, I could modify the content based on its conditionality. Because conditions can apply to almost anything in a Flare project, almost anything should be modifiable as long as the things to be modified are specified in a control file that’s in the Output folder. For example, I was able to modify the skin banner for tablets by changing the background-color under nav.title-bar in tablet.css in output > owner > HTML5 > skins > fluid > stylesheets. Similarly, I was able to modify the skin banner for mobile by making the same change but in mobile.css.
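To make the change concrete, here’s a sketch of what the edit in tablet.css looks like. The selector and property follow the nav.title-bar rule described above; the surrounding declarations vary from skin to skin, so treat this as an illustration rather than the literal contents of the generated file:

```css
/* In output > owner > HTML5 > skins > fluid > stylesheets > tablet.css */
/* Sketch only – the generated rule contains other declarations. */
nav.title-bar {
  background-color: #c0c0c0; /* overrides the color generated by the target */
}
```

Saving this change and refreshing the browser is all it takes – no rebuild of the target. The same edit in mobile.css changes the phone-sized banner.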
Here’s an example, first with the skin banner colors generated in the target output.
And finally, mobile-sized – i.e. phone-sized:
Now, here’s the same output but with the tablet and mobile banner colors set in tablet.css and mobile.css as described above. (I’m leaving the desktop-sized banner in red.)
Tablet-sized (the hex value for the silver is C0C0C0):
And mobile-sized (the hex value for the orange is ffa500):
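The manual edits above can also be automated, which is what makes the sensor-driven scenario from the webinar practical. Here’s a minimal sketch in Python of the idea: map an incoming reading to a banner color and rewrite the background-color inside the nav.title-bar rule of a generated stylesheet. The sensor_to_color mapping and the patch_banner_color helper are my own invented examples, not part of Flare; a real script would read and write the actual tablet.css or mobile.css file in the output folder.

```python
import re

def sensor_to_color(temperature_c):
    """Hypothetical mapping from a simulated temperature reading to a
    banner color (orange when hot, silver otherwise)."""
    return "#ffa500" if temperature_c >= 30 else "#c0c0c0"

def patch_banner_color(css_text, new_color):
    """Rewrite the background-color value inside the nav.title-bar rule,
    leaving the rest of the stylesheet untouched."""
    pattern = re.compile(
        r"(nav\.title-bar\s*\{[^}]*?background-color\s*:\s*)[^;}]+",
        re.DOTALL,
    )
    return pattern.sub(lambda m: m.group(1) + new_color, css_text)

# Simulated stylesheet fragment; in practice you would read this from
# output/owner/HTML5/skins/fluid/stylesheets/tablet.css, write the
# patched text back, and simply refresh the browser.
css = "nav.title-bar { background-color: #ff0000; height: 60px; }"
patched = patch_banner_color(css, sensor_to_color(35))
print(patched)
```

The point isn’t the regex – any templating approach would do – but that the control data can come from anywhere and the output changes on the next refresh, with no regeneration step.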
It may be possible to control other elements of an output – as I noted above, this was more of a proof-of-concept than a complete test. But it does work. And as I noted in the webinar, this offers the ability to customize our outputs in response to a wide variety of environmental data. And it does so in real time, which may open up industrial applications for which the traditional build process was too time-consuming.
It’s also just cool…
If you’d like to discuss this further, feel free to email me at firstname.lastname@example.org.