Monday, December 30, 2013

How Edapt Works

The gory details of Edapt migrations


Edapt is an Eclipse technology which enables 'coupled' evolution of an Ecore metamodel and instances of that model. Coupled evolution means evolving the metamodel (.ecore) and the model instance hand-in-hand.

As we will see, Edapt implements the concept of coupled evolution in a very sophisticated way. It provides great flexibility with many reusable model migration Operations, and custom migration Operations are supported as well.

Why this blog

This blog explains how the Edapt Migrator works (it's not an Edapt tutorial!). I wrote it because, as one of the developers of Edapt, I investigated how Edapt works and made many notes. Not sure where to store this study, I figured others could benefit: the migration process is rather complex and large and can easily daze you. As it turned out for me, understanding the inner workings also helped me understand when a custom migration Operation is needed and how to implement it.

Prerequisites 


Experience with EMF is recommended. Edapt literally constructs Ecore metamodels back and forth, so understanding Ecore is key. There are many tutorials. Here is one of them.

Experience with the Edapt History editor and Operations Viewer is also recommended, as these are the primary tools for defining model migrations. The Edapt tutorial is a must.

(Note: Edapt can now be installed on the latest Eclipse release, named Kepler, with this plugin repository.)

Content


  • What is the Migrator?
  • Migrator concepts.
  • Migration example dissected.
  • Migrator details.

What is the Migrator?


The Edapt documentation explains the Edapt migrator here. It teaches us how to contribute a Migrator and how to execute a migration, e.g. by extending editor code to detect whether a migration is needed and actually performing it. It doesn't tell us how it works, and that is what this blog is about.

For us to understand how the Migrator works, I will explain the various concepts and show how it acts upon a sample metamodel and model instance.

What happens in a migration process can be summarized as:
  1. Process a History by visiting its releases and changes.
  2. Create an inner map from the original metamodel to a constructed target metamodel in memory, which we will call the reference metamodel.
  3. Load a model instance (through a converter) with the corresponding metamodel of a release if it's not the target release yet. (Effectively, determine whether migration of a model is required based on its release.)
  4. Migrate the model instance and referenced metamodel by applying primitive and operation changes.
  5. When the target release is reached, finish the process and persist (save) the model back to a target resource (a file identified by a URI).
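The steps above can be sketched as a loop over the history's releases. This is a minimal plain-Java sketch; the class, record and method names are illustrative only, not Edapt's actual API:

```java
import java.util.ArrayList;
import java.util.List;

public class MigrationLoopSketch {

    // A release of the history with the changes recorded for it.
    record Release(int number, List<String> changes) {}

    // Walk the history from the model's release up to the target release,
    // collecting the changes that must be applied to the model instance.
    static List<String> migrate(List<Release> history, int modelRelease, int targetRelease) {
        List<String> applied = new ArrayList<>();
        for (Release release : history) {
            if (release.number() <= modelRelease) continue; // model already conforms
            if (release.number() > targetRelease) break;    // stop at the target release
            applied.addAll(release.changes());              // apply primitive/operation changes
        }
        return applied; // at this point the migrated model would be persisted
    }
}
```

A model already at the target release collects no changes, which mirrors how Edapt skips migration when the release check succeeds.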

The Edapt Migration State Model


One key aspect to understand in this approach is that various metamodels are maintained during the migration process.
  1. The original .ecore metamodel which is referenced by the changes in the History. 
  2. The 'constructed' metamodel in memory which is in the state of the changes applied to it. This metamodel can be identified as the 'reference' metamodel. It is the reference for applying metamodel changes.
  3. The 'constructed' metamodel referenced by the model instances. (See MMMeta)

Migrator concepts


Edapt defines two abstractions to perform the migration. These are:
  • Metamodel Migration model (MMMeta)
  • Model Migration model (MMModel)
Both the MMMeta and the MMModel are bound together in a Repository for easy access.
Then there are the Reconstructors and Converters.

A Reconstructor processes the History model and reconstructs a certain release, going forward or backward through the releases in the history. The reconstruction process acts on the mapping and the reference metamodel for primitive changes which do not affect a model instance. It also acts on the MMMeta and MMModel when the actual model instance is impacted.

Note: The reconstruction process is also available through the Edapt UI acting on an Ecore editor. In this case it will only reconstruct the metamodel and not the actual model instance.

Converters are required to convert a model back and forth between an MMMeta/MMModel and an EMF ResourceSet. Finally, Edapt deals with persistence through a utility class named Persistency, which we will discuss as well.


   

Migration Example dissected

Here we follow an example which is part of the Edapt tests. The example model and metamodel can be obtained from here.

History Model


Release 1 (Which is only the metamodel definition).


<releases date="2008-11-23T22:45:42.562+0100">
<changes xsi:type="history:Create" element="component.ecore#/">
(1)        <changes xsi:type="history:Set" element="component.ecore#/" featureName="name"
            dataValue="component"/>
(2)        <changes xsi:type="history:Set" element="component.ecore#/" featureName="nsURI"
            dataValue="http://component/r0"/>
(3)        <changes xsi:type="history:Set" element="component.ecore#/" featureName="nsPrefix"
            dataValue="component"/>
 </changes>

etc.... (The rest of Release 1 further builds up the metamodel.)


Release 2 (which actually starts to change Release 1; model instances conforming to Release 1 will be migratable with Operations from Release 2).


 <releases date="2008-11-23T22:49:28.078+0100">
    <changes xsi:type="history:MigrationChange" migration="org.eclipse.emf.edapt.tests.migration.custom.ComponentSignatureCustomMigration"
       >
      <changes xsi:type="history:OperationChange">
        <changes xsi:type="history:Create" target="component.ecore#/" referenceName="eClassifiers"
            element="component.ecore#//InPort">
          <changes xsi:type="history:Set" element="component.ecore#//InPort" featureName="name"
              dataValue="InPort"/>
          <changes xsi:type="history:Add" element="component.ecore#//InPort" featureName="eSuperTypes"
              referenceValue="component.ecore#//Port"/>
        </changes>
        <operation name="newClass">
          <parameters name="ePackage">
            <referenceValue element="component.ecore#/"/>
          </parameters>
          <parameters name="name">
            <dataValue>InPort</dataValue>
          </parameters>
          <parameters name="superClasses">
            <referenceValue element="component.ecore#//Port"/>
          </parameters>
        </operation>
      </changes>

 etc...

...or as a screenshot from the History editor:


Following along


Here we illustrate the migration process step by step.  



Release 1 (Here the metamodel is constructed).

1. caseCreate()
 

Change => Create (EPackage)
  • Create a new EPackage.
  • Add it to the MetamodelExtent cache.
  • Add a mapping from the EPackage in the Create Change to the newly created EPackage. (Later, when loading the MMMeta, the extent is used to get the EPackage.)

2. caseSet()

  • Get the target element from the Set.
  • Get the equivalent from the mapping definition.
  • Set the attribute (feature) on the target element.
etc... continued construction of the metamodel


Release 2 (Here the model is also migrated, as we use Operations)
 

Change => Custom Migration

2.1 startChange() which delegates to the MigrationReconstructor.

  • caseMigrationChange()
  • load the Custom Migration (In our case "ComponentSignatureCustomMigration")
  • call migrateBefore() only. (migrateAfter() is what this custom migration implements, so nothing really happens here.)
2.2 calls switch again but not delegating to other reconstructors.
  • caseMigrationChange() => null (EcoreFwReconstructor returns null).
2.3 .. back in the ForwardReconstructor
  • Iterate over the MigrationChange children.
  • caseOperationChange()
  • creates a copyResolve OperationInstance from the ResolverBase (it copies and resolves Ecore elements from the mapping).
  • Converts the OperationInstance to an OperationImplementation.
  • calls the OperationImplementation instance's checkAndExecute() with the model and metamodel.
  • We end up in the Operation implementation, which is NewClass for this operation; it calls the MetamodelFactory to create the class with all the operation parameters.

2.x endChange() which delegates to the MigrationReconstructor 

etc..

When the target release is reached, the MMModel is converted back to a valid model instance conforming to the target release of the Ecore metamodel, which completes the migration process.

Migrator details

We now explain the various Migrator concepts in more depth.

MMM's

MMMeta

The metamodel migration model (MMMeta) is a migration specific representation of the .ecore metamodel.

MMMeta Instance creation

The MMMeta instance is created with the EPackage from the MetamodelExtent corresponding to a model's nsURI. The Metamodel instance is then available to the migration process, for example to load a model instance with the correct .ecore.


MMModel

The model migration model (MMModel) is a migration-specific representation of a model instance conforming to one of the releases of a metamodel.

The basic idea is to group instances, attributes and references together, so a change can easily iterate over the Instances and modify, for example, the Type of an Instance in the MMModel (in one of the changes which affects the MMModel). Later on, as we will see with the converters, the 'migrated' MMModel is serialized back into a regular Ecore model instance and can be persisted.

The following entities exist (somewhat simplified):
  • Model => The MMModel
  • Instance => Each EObject in the actual model instance gets an Instance with a Type
  • Type => Each Instance has a Type. A Type has an eClass which corresponds to the original eClass of the EObject
  • AttributeSlot => Each attribute of an EObject gets an AttributeSlot holding the EStructuralFeature and EJavaObject values
  • ReferenceSlot => Each reference of an EObject gets a ReferenceSlot holding the EStructuralFeature and the Instance pointed to by this ReferenceSlot
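A plain-Java sketch may make this grouping concrete. This is simplified and hypothetical; the real types live in Edapt's migration model, not here:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MMModelSketch {

    // Type wraps the eClass of the original EObject (reduced to a name here).
    record Type(String eClassName) {}

    // Each EObject becomes an Instance carrying its Type plus slots for
    // attribute values and reference targets.
    static class Instance {
        final Type type;
        final Map<String, Object> attributeSlots = new HashMap<>();
        final Map<String, List<Instance>> referenceSlots = new HashMap<>();
        Instance(Type type) { this.type = type; }
    }

    // A migration can then simply iterate all Instances of a given type.
    static List<Instance> instancesOf(List<Instance> model, String eClassName) {
        List<Instance> result = new ArrayList<>();
        for (Instance instance : model) {
            if (instance.type.eClassName().equals(eClassName)) result.add(instance);
        }
        return result;
    }
}
```

The point of the design shows up in `instancesOf`: because instances carry their Type explicitly, a change can retarget or filter them without reflective EMF machinery.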

Reconstructors


As said, the reconstructors take the .history and allow 'building' a certain release of the metamodel and instance model. The Migrator uses the EcoreForwardReconstructor (as models typically age and need to be migrated forward).

Now the EcoreForwardReconstructor extends the CompositeReconstructorBase, which delegates the reconstruction to one or more reconstructors declared with it. This is the typical delegation pattern allowing the reconstruction process to be extended, and it is exactly how the Migrator works: the Migrator adds the MigrationReconstructor to the EcoreForwardReconstructor so delegation happens when needed.

As we will see in the Reconstruction Process, at some point in the reconstruction we hit a Change. This definition comes in many forms (many types of changes). In order to act appropriately on the Change type, a reconstructor typically implements a model object Switch.

In the Edapt case, the History code generation produced the HistorySwitch, which is extended by the various reconstructors to perform the appropriate action.

The MigrationReconstructorSwitch for example deals with specific Change implementations like an OperationChange and a MigrationChange to add or delete from the MMMeta or MMModel.
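The dispatch idea behind such a generated Switch can be sketched in plain Java. The names here are hypothetical; the real HistorySwitch is generated from the History model:

```java
public class ChangeSwitchSketch {

    interface Change {}
    record Create(String element) implements Change {}
    record Set(String element, String feature) implements Change {}

    // The switch routes each Change subtype to its own case method;
    // reconstructors override just the cases they care about.
    static class ChangeSwitch {
        String doSwitch(Change change) {
            if (change instanceof Create create) return caseCreate(create);
            if (change instanceof Set set) return caseSet(set);
            return defaultCase(change);
        }
        String caseCreate(Create create) { return "created " + create.element(); }
        String caseSet(Set set) { return "set " + set.feature(); }
        String defaultCase(Change change) { return null; }
    }
}
```

A subclass overriding only caseSet() still inherits the routing, which is why several reconstructors can share one HistorySwitch shape.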


Mapping and resolving

Whenever a migration kicks in, it creates a Mapping instance which is initialized through all reconstructors and delegated reconstructors via the init(...) method of a reconstructor.

The Mapping contains a TwoWayIdentityHashMap for mapping EObjects to each other.

One of its usages is to map Change elements to the in-memory equivalents (the Ecore metamodel) created for these Change elements.
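A minimal sketch of such a two-way identity map, assuming only that lookups go by object identity (which is what java.util.IdentityHashMap provides):

```java
import java.util.IdentityHashMap;
import java.util.Map;

public class TwoWayIdentityMapSketch<S, T> {

    // Two identity-based maps kept in sync; keys match by == not equals().
    private final Map<S, T> forward = new IdentityHashMap<>();
    private final Map<T, S> backward = new IdentityHashMap<>();

    public void put(S source, T target) {
        forward.put(source, target);
        backward.put(target, source);
    }

    public T getTarget(S source) { return forward.get(source); }
    public S getSource(T target) { return backward.get(target); }
}
```

Identity rather than equals() matters here: two distinct EObjects can be structurally equal, yet the mapping must distinguish them.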

In this case the history model is visited, starting with the initial Release. From this Release, the Ecore metamodel, in the form of one or more EPackages, is gradually built up into the first release as intended by the history (with corresponding nsURIs).

Then, in subsequent releases and their underlying changes, the reference EPackage is adapted gradually. When migrating the actual model instances (which is only applicable for some changes), the model instance references to the Ecore model artifacts (EClass, EReference, EAttribute) are resolved from this very same mapping. It is therefore absolutely key that the MMMeta and MMModel are loaded with the same EPackage from the 'extent'.

Example:

When adding a new feature to a class, the target element from the Change is looked up in the mapping, and the new feature is added to the EClass (in the mapping).

Alongside the Mapping utility there is a ResolverBase class to resolve elements from the mapping. The ResolverBase has a special method named:

copyResolve

For an OperationChange in the History, it performs a copy of the model element while resolving from the mapping at the same time. It descends the hierarchy of features and resolves when an element's package is of type EcorePackage.

Effectively this means that when the OperationChange refers to a metamodel definition, the resolver resolves from the mapping, making sure the constructed Ecore metamodel is used.

Converters


There are two converters:
  • ForwardConverter => converts a ResourceSet to an MMModel
  • BackwardConverter => converts an MMModel back to a ResourceSet
ForwardConverter

The 'Model' model is populated in this order:

initElements(); (EClassifiers etc..).
initProperties() (EAttribute => AttributeSlot, EReference => Slot / ReferenceSlot).
initResources()

BackwardConverter

The ResourceSet and its Resources are loaded in this order:

initObjects(model);
ResourceSet resourceSet = initResources(model);
initProperties(model);

Persistence


Persistence is handled through a utility class named Persistency. This utility is specialized to ensure that loading and saving of model instances respects the metamodel version for which the model should be loaded/saved.

One aspect of dealing with XMI-serialized models is the potentially dynamic nature of a Resource load implementation. EMF supports the dynamic creation of an EPackage based on the schemaLocation attribute. The schemaLocation attribute will potentially point to an instance of the .ecore which was constructed at a certain release of the history, so loading the resource will auto-create an EPackage for that release. An EObject will then have an eClass whose parent EPackage belongs to a certain release of the history.

This is important, as the various reflective functions which act on the EPackage should act on the exact intended EPackage 'version'. I ran into this when trying to copy a loaded model with EcoreUtil (to load a copy in another Resource): the model had an EPackage which corresponded to the latest release, while the model itself was serialized with a previous release.

Edapt has encountered this issue and deals with it in the following manner:


When loading a model it provides the EPackage to use from the MMMeta instance (See MMMeta Instance creation).  The EPackage is mapped to the nsURI of the model in the EPackageRegistry of the ResourceSet, so when the resource is loaded, it consults the EPackageRegistry and uses the EPackage instead of dynamically loading the EPackage.
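The lookup order can be simulated with a plain map. This is a sketch only; in EMF the real mechanism is the ResourceSet's EPackage.Registry, consulted by nsURI before any dynamic EPackage creation:

```java
import java.util.HashMap;
import java.util.Map;

public class PackageRegistrySketch {

    // nsURI -> package; a registered package wins over dynamic creation.
    private final Map<String, String> registry = new HashMap<>();

    public void register(String nsURI, String ePackage) {
        registry.put(nsURI, ePackage);
    }

    // Consult the registry first; only create a package dynamically
    // (as a resource load would from schemaLocation) when nothing is registered.
    public String resolve(String nsURI) {
        String registered = registry.get(nsURI);
        return registered != null ? registered : "dynamic package for " + nsURI;
    }
}
```

Registering the release-specific package up front is exactly what prevents the mismatch described earlier, where the model was loaded against the latest-release package instead of the one it was serialized with.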
Reconstruction Process

The reconstruction process 'visits' the History model hierarchy; it has hooks for the start and end of Releases and Changes.

When descending the history to the intended release, the reconstructor delegates each change (via the CompositeReconstructor) to the MigrationReconstructorSwitch, which 'switches' on the Change.

 
At a certain point the migration reconstructor loads the 'Model' model and a 'Metamodel' instance. This happens when the end of a Release is reached which is not yet the targeted release.

If the change is one of the types CompositeChange, MigrationChange or InitializerChange, then the ForwardReconstructor also reconstructs the children of the specialized Change instance.

The reconstruction process can be represented as:

startHistory History
    startRelease Release
        for Release.changes()
            startChange Change
                startChange (CompositeReconstructor).
                switch Change
                    CompositeChange
                    MigrationChange
                    InitializerChange                 
            endChange change
    endRelease Release => (If the Release is the original release, load the model; see MigrationReconstructor)
endHistory History


Persistence.saveModel();
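The start/end hooks in the trace above can be sketched as a recursive walk (illustrative names, not Edapt's reconstructor API):

```java
import java.util.List;

public class ReconstructionWalkSketch {

    record Change(String type, List<Change> children) {}
    record Release(List<Change> changes) {}

    // Visit the history: start/end hooks around each release and change.
    static String walk(List<Release> history) {
        StringBuilder log = new StringBuilder("startHistory ");
        for (Release release : history) {
            log.append("startRelease ");
            for (Change change : release.changes()) walkChange(change, log);
            log.append("endRelease ");
        }
        return log.append("endHistory").toString();
    }

    // Composite-like changes (CompositeChange, MigrationChange,
    // InitializerChange) recurse into their children.
    static void walkChange(Change change, StringBuilder log) {
        log.append("startChange:").append(change.type()).append(' ');
        for (Change child : change.children()) walkChange(child, log);
        log.append("endChange ");
    }
}
```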


The switches process the following Change types:
 

caseAdd()
caseCreate()
caseDelete()
caseMove()
caseRemove()
caseSet()

caseMigrationChange()
caseOperationChange()


Conclusion


The Edapt Migrator and its concepts have no more secrets! We explored the concepts and how they work. The migration process which couples the metamodel and model is quite impressive. In subsequent posts on Edapt I will elaborate on how to work with a non-XMI Resource based Persistency, for example CDO.

Friday, June 28, 2013

Cheating on Bucky & Is that a component behind that tree?

Cheating on Bucky

How do you go about learning something new? The intuitive approach is to read all you can grab about a topic. I tend to do this myself. In the case of Buckminster the newbie is for sure not left in the dark: there is a free downloadable book, and the wiki is undoubtedly one of the better wikis covering an Eclipse technology topic.

Personally I tend to get impatient after some reading, and want to try things out for my own needs. This is a pitfall which can lead to long-lasting trial-and-error sessions. One should really be able to recognize trial-and-error, step back and say... oh no... I need to learn this thing before spending more time fiddling around. In the case of learning Bucky, this is exactly what happened, so I decided to create a cheat sheet which would present the big Bucky picture and at the same time hold sufficient detail. As we all know, the best learning happens when writing and drawing. The brain sucks it in like a sponge!

The first version can be found here; it has not been reviewed by a Buckminster authority, so please use it at your own risk!

Is that a component behind that tree?

Wow, what a relief. I feel I master Buckminster, and am ready to build all the components in the world. The actual setup of Buckminster artifacts like cquery, rmap etc... for NetXStudio came with a different challenge: setting up the .rmap to locate components and getting the correct locator URLs came with surprises, which I would like to share here.

This is my 3.x shopping list:

- Eclipse Platform + RCP ( I need the .ide plugin for various reasons).
- Eclipse EMF 2.something.
- Eclipse EMF/CDO 4.2 (And actually 4.0 first).
- Eclipse GEF/Draw2d/Zest
- Eclipse Nebula widgets.
- Eclipse Xtext 2.something.
- Eclipse Xpand, Xtend and MWE, tricky stuff, as these components are getting old..
- Google Collect/Inject
- Apache various libs, the usual suspects... (I am sure you know them).
- Apache POI for reading/writing certain spreadsheet formats.
- Javax stuff, like persistence API.
- SWT Chart, a nice and well documented charting widget.
-.... the ones I can't think of right now.

Here started my venture: I needed to walk through the forest of projects publishing their components, and pick the ones I needed. I also needed to pick a component reader, which is luckily very often p2, so that was a quick win. One question pops up all the time: why don't I download this thingy, push it to my Git repo, and then resolve it from my workspace? I could do that for the whole component list above, but obviously this has some drawbacks:

  • I would keep a shadow of already managed and published components. 
  • I won't be continuously integrating. (Do I actually really want that?). 
So, resisting the "self-push-into-github-components-I-need" temptation, I took out my torch, put on a helmet and went URL hunting. (I recommend bringing a large cup of cappuccino, and don't forget to let friends and family know you will be away for a while.)

Resolving Platform

For the platform, I found this:

http://wiki.eclipse.org/Eclipse_Project_Update_Sites

It has p2 sites for older releases, and the latest and greatest in the flavours Milestone, Integration and Nightly.
Note: not to be confused with the simultaneous releases, which hold the p2 repositories bundling many Eclipse components in one single well-known p2 repository (i.e. Juno, Indigo etc..).

Resolving EMF

Next up is EMF. Now EMF is rock solid. With great amusement, I follow the occasional attempt on the EMF forum to talk "it" into incorrect behavior. This usually fails, and rarely is a fix needed.
But how about that p2 URL? Well, so far the main p2 repository advertised on the download page is this:

http://download.eclipse.org/modeling/emf/updates/releases/

Now that's weird: pasting this p2 link into the IDE, I only see older EMF releases. In the case of EMF, I'd like to get the latest stable, let's say 2.8... but where is the p2 URL? I can download it as a whole, but no p2 can be located... Perhaps I can tell Bucky to link to the p2 in .zip format with this:

I dug around and wondered: with CI, there must be tons of projects requiring EMF, so how do they do it? The .rmap from CDO gave it away; here is the URL they use:

http://download.eclipse.org/modeling/emf/emf/updates/

This can be used as a p2 repository in the IDE without problems. Notice that the emf fragment appears twice in the URL. Where is this documented...? No answers for now. The URL can be specified in more granular form by appending a release, i.e. "2.9-I-build", which gives you that specific integration build. At this very moment, I haven't figured out what the pattern is; appending 2.8.1, for example, does not yield a valid p2 URL.


Resolving EMF Compare

Next up is EMF Compare. This is used by Xtext to diff models, so depending on the Xtext version there will be a dependency on a given EMF Compare version; in my case this was 1.2. Unfortunately the packaging changed in 2.0, so simply pointing to the 2.0 update site didn't work. I had to point to an older release, luckily available in p2 format.

This is the URL, I ended up using:

Resolving M2T (Xpand/Xtend)

These buggers are available from the Model To Text (M2T) project. In my case I need some older releases, which are archived and not available as p2. (The zipped p2 URL isn't accepted as a valid repo when pointed to directly.) The archive could be extracted on the server, but in this case I felt more comfortable importing the projects into my workspace and committing them to Git. (Yes, in this case I didn't resist self-managing.)

Resolving Xtext

This was pretty easy, at least that's what I thought... The Xtext p2 URL is this:


This will get you the latest Xtext, which is fine; however the generated Xtext editor has optional dependencies related to Xtext builders and code generators. It ends up (optionally) needing the JDT, ltk, emf.codegen and more..., which I don't want in a runtime. Although Buckminster will recognize these as optional, if they are resolved it will also try to resolve the dependencies of the children, which might not succeed. This is when an advisor node in the .cquery comes in handy: just skip the child components from optional plugin dependencies.

There is also a Composite Release URL, which seems to contain everything from Itemis.
This is the one:

http://download.eclipse.org/modeling/tmf/xtext/updates/composite/releases/

I haven't tried it, but I think with the p2 reader it's possible to point to specific categories of the p2 repo.

Done! Well, that's what I thought...

So finally the sweet smell of success, all my components are resolved, and no more little red dots in the !

Give me more! 


What is left is a ton of wishes: how do I add my unit tests? Can I produce the Javadoc? I actually want to publish the help as wiki, HTML and PDF... all those cool things... I am sure I'll find solutions for all these challenges in the time to come.


Monday, April 29, 2013

Eclipse Modeling Framework - UI E4

I want to learn Eclipse 4, or e4 in short, and I want to learn about Xtend as well. Then I figured: why not take the existing generated Eclipse 3.x EMF editor and migrate it to e4? It likely requires adapting the templates which generate the editor, so here I can try and use Xtend.

Get the result (so far)!
  • Grab E4MF here (it's not done; please log a bug on GitHub for enhancements).
If you wonder what the Eclipse EMF editor is, you should read about it here; suffice it to say that it allows you to generate a fully functional editor for whatever EMF model you define.

The Conclusion

Migrating a fully functional multi-page editor which interacts with other views to e4 is not a trivial task. First of all it's about clean-up. Dependency injection is very powerful, and the new e4 platform provides a very useful implementation of these concepts.

The Approach:

The e4 platform is really 'different', to say the least. It's been out for a couple of years now, and there is sufficient information around to learn from and try to actually achieve my goal, but still there are some fundamentally different concepts.

I list a few of them here:
  1. An e4 UI application is first constructed as an application model. So there is an Application.e4xmi with its own editor to define the application structure.
  2. Dependency injection is everywhere. Parts are POJOs, so they do not (or barely) extend classes or implement interfaces. If services are needed, they will be injected. There are plenty of examples.
  3. The application UI model and the actual rendering are de-coupled. This means that the rendering can be exchanged, from e.g. SWT to JavaFX or Swing or whatever.
  4. All org.eclipse.ui.* plugins are deprecated as 3.x platform plugins. This is an important fact to consider in the migration.

My approach is to take the EMF extended library example and generate the editor plugins; there are typically three of them:

org.eclipse.emf.examples.library.edit
org.eclipse.emf.examples.library.editor
org.eclipse.emf.examples.library.test

...and then make a copy of these and turn them into a working e4 editor. This will form the target to generate, i.e. for the templates which generate from the EMF .genmodel files.

.genmodel options

Now this is a bit tricky. The .genmodel file has various options which add/remove functionality from the .edit and .editor plugins. In my approach I will use the defaults and worry about the optional features later. I know this 'could' get me in trouble, but I want to get started, and don't feel like figuring out the most complete set of options for the most functional EMF editor.

The only option I set here is to generate an RCP version of the editor.
This has some consequences as we will see later on.

Finding dependencies

We know that for a pure e4 product, org.eclipse.ui plugins should not be used anymore. So we need an overview or map of where these dependencies exist. What I am going to do is remove these dependencies, see what breaks, and replace them with the e4 alternative.

The generated editor has the following dependencies.

org.eclipse.emf.examples.library.editor
=> org.eclipse.core.runtime
=> org.eclipse.emf.examples.library.edit
=> org.eclipse.emf.ecore.xmi
=> org.eclipse.emf.edit.ui

The first two will need some rework, the latter two are not UI plugins so these are OK.

org.eclipse.emf.examples.library.edit
=> org.eclipse.core.runtime
=> org.eclipse.emf.edit
=> org.eclipse.emf.examples.library

After inspecting the dependencies of org.eclipse.emf.edit, I am surprised to see there are no UI dependencies. The .edit plugin therefore requires no adaptation, although it provides the base implementations for the command and adapter patterns used in EMF.

org.eclipse.emf.edit.ui
=> org.eclipse.core.runtime
=> org.eclipse.ui.views
=> org.eclipse.ui.workbench
=> org.eclipse.emf.edit
=> org.eclipse.emf.common.ui
=> org.eclipse.core.resources (Optional)
=> org.eclipse.ui.ide (Optional)
=> org.eclipse.jface.text (Optional) 

So what needs to be adapted? As we stated earlier, the org.eclipse.ui.* plugins have been replaced in e4. In our case the following plugin dependencies are affected:
=> org.eclipse.ui.views
=> org.eclipse.ui.workbench
=> org.eclipse.ui.ide (Optional)

Also the EMF common.ui plugin has dependencies on the Eclipse 3.x UI framework.

org.eclipse.emf.common.ui 
=> org.eclipse.core.runtime
=> org.eclipse.ui
=> org.eclipse.emf.common
=> org.eclipse.core.resources (Optional)
=> org.eclipse.ui.ide (Optional)
=> org.eclipse.jface.text (Optional)

Here only org.eclipse.ui is a dependency we haven't seen before, so this will be migrated as well.
 
I conclude that the base UI plugins from EMF will need to be reworked. I decide to clone EMF and rename these plugins (as they will need to co-exist if the EMF team wishes to pull from me). The new names are:

org.eclipse.e4mf.edit.ui
org.eclipse.e4mf.common.ui 

Now my instinct tells me to break the 3.x dependencies here and fix them with the e4 alternative, so I decide to do so. But before I go into this, I would like a basic EMF e4 editor working, so I start here:


org.eclipse.emf.examples.library.editor

The approach is to drill down the application structure top-down. So we start with the Application, then the Perspective, then editors and views, then Actions etc....

Step 1. Fix the dependencies, which have changed with the e4 versions of

org.eclipse.e4mf.edit.ui
org.eclipse.e4mf.common.ui

Step 2. Create an e4 Application Model.

File -> New -> Eclipse 4 -> Model -> New Application Model
This creates a file named Application.e4xmi

[TRICK] You can instruct e4 to load the application model from a specific location with the following property in the Application extension definition:

<property
    name="applicationXMI"
    value="org.eclipse.emf.examples.library.e4editor/xmi/Application.e4xmi">
</property>

Here we tell e4 to load the model from a subdirectory 'xmi'; this is also where we keep e4 fragments.

Step 3. Populating the Application Model

Here we migrate the generated EMF editor functionality into the application model.
In e4 we need a perspective stack and the actual perspective.

PerspectiveStack => ID: org.eclipse.emf.examples.library.e4editor.perspectivestack.0 (Generated)
Perspective => ID: org.eclipse.emf.examples.library.e4editor.perspective.0

Now in the EMF editor we have the actual model editor on the left, and the outline and properties views on the right. So we need PartSashContainers, PartStacks and Parts for this. Here are the IDs:

PartSashContainer => ID: org.eclipse.emf.examples.library.e4editor.partsashcontainer
      PartStack => ID: org.eclipse.emf.examples.library.e4editor.partstack.editor
      PartSashContainer => ID: org.eclipse.emf.examples.library.e4editor.partsashcontainer.1
           PartStack => ID: org.eclipse.emf.examples.library.e4editor.partstack.0
                   Part => ID: org.eclipse.emf.examples.library.e4editor.part.1
           PartStack => ID: org.eclipse.emf.examples.library.e4editor.partstack.1
                   Part => ID:  org.eclipse.emf.examples.library.e4editor.part.2


As we will see later on, we would actually like a PartDescriptor for opening the editor; unfortunately, as of writing, this is not supported.

Maximize and Minimize

The base functionality doesn't have the maximize and minimize buttons for a PartStack.
Adding the plugin org.eclipse.e4.ui.workbench.addons.swt to the launch config will make some additional e4 addons available. One of them is the MinMax addon.

[TRICK]
With e4 it's possible to show the runtime application model editor. This is especially useful when working with fragments and processors (application model contributions).

To do this, include the following plugins in the launch config or product or other build system.
  • org.eclipse.e4.tools.emf.liveeditor 
  • org.eclipse.e4.tools.emf.ui 
  • org.eclipse.e4.tools.emf.ui.script.js
  • (Additional required)
When available, the key combination Alt+Shift+F9 brings up the live editor.
[BUG] On Mac OS X it does not work with the current build (Kepler M6):
https://bugs.eclipse.org/bugs/show_bug.cgi?id=394503

Step 4. The EMF Editor

In e4, an editor is a POJO, so it extends neither EditorPart nor MultiPageEditorPart.
The EMF editor has several functionalities. In order to rebuild them one by one, we first:
  • Remove the MultiPageEditorPart superclass 
  • Remove the overridden methods (we are a POJO!)
  • Implement various e4 concepts.
Constructing the UI

In e4 we can designate any method and add e4 lifecycle annotations like @PostConstruct, which cause the method to be called at the corresponding point in the lifecycle of a part or other UI element. On top of that, we can add method arguments, which will be injected (if available) by the context.

For our EMF Editor it starts with the init method.

Simply doing this gives us a parent composite to add our views to:

@PostConstruct
public void init(Composite parent, ......[more arguments to follow]){
   createPages();
}

We call createPages(), which is normally called by MultiPageEditorPart. As there is no equivalent for MultiPageEditorPart, we settle for adapting createPages() to create only one page.

ViewerPane

Viewer panes have a title bar and functionality to maximize/restore the viewer pane.
The concept is however very tightly tied to IWorkbenchPart and IWorkbenchPage. The widget is actually placed inside a so-called ViewForm, which allows control of layout, margins etc.
One other feature we will lose from ViewerPane is that the title is updated with the object selected
in the viewer pane.

For now, we decide not to migrate this concept and create the viewers directly under the parent composite.
 

Init the Editor input

The init(IEditorSite site, IEditorInput input) method of a 3.x editor should find its equivalent
in e4. The 3.x definition in the plugin.xml starts with <extension point="org.eclipse.ui.editors">. It allows us to specify the implementation, an icon, a contributor class and more.

The EMF Editor in 3.x generates a content type parser and a model specific content type
which are used to respectively parse the content and associate a file type by extension with the EMF generated editor.

Unfortunately e4 currently doesn't have a part descriptor which resembles the 3.x equivalent. There is an MInputPart and it can be put in the Application Model, but it will be static. What we want is to query the framework for editor descriptors which match a criterion like a file name or protocol. This concept is recognized and named EditorPartDescriptor. An extension to the e4 Application Model to support this is available in the simple IDE demo (Tom Schindl).

Some learnings from the demo:
  • It shows how an input from one MPart (Navigator) is set on the context with context.set(IFile.class, f); and later injected into a Handler's @Execute method arguments. The context in this case is the IEclipseContext. 
  • The contributed EditorPartDescriptors for the .xml, .text and .java editors are checked for their supported extensions. If a match is found, the e4 CommandService is used to fire a command, which will invoke the OpenEditor handler, which then creates an MInputPart and sets the contributionURI according to the EditorPartDescriptor to make sure the correct editor is contributed and instantiated.
  • It demonstrates how the input URI of the MInputPart is adapted to an IDocument which the editor can consume. The adaptation is done with a so-called ContextFunction. The adapter implements org.eclipse.e4.core.contexts.IContextFunction, which is offered as an OSGi service.

[DECISION] We do not implement this concept, as it requires an extension to the e4 Application model. See further the "Open Handler" for the actual chosen implementation.

Modularity and Contributions in e4

We could add a model fragment, but this will also be static. It is intended to provide extensibility but does not substitute for the old extension point *.editors.

The best possible solution for now is:

1. Implement an Open Handler
2. Create an MInputPart programmatically
3. Set the Contribution URI to our own EMF Editor
4. Set the Input URI for the MInputPart
5. Activate the part with the EPartService
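Sketched in code, the five steps could look roughly like this. This is an assumption about the handler's shape, not the actual implementation: the editor class in the contribution URI and the input URI are hypothetical, while the part stack ID is the one defined in Step 3 (step 1 is the handler itself, invoked via a Command).

```java
public class OpenHandler {

	@Execute
	public void execute(MApplication app, EModelService modelService, EPartService partService) {
		// 2. Create an MInputPart programmatically.
		MInputPart part = MBasicFactory.INSTANCE.createInputPart();
		// 3. Set the contribution URI to our own EMF editor POJO (class name is hypothetical).
		part.setContributionURI(
				"bundleclass://org.eclipse.emf.examples.library.e4editor/org.eclipse.emf.examples.library.e4editor.EXTLibraryEditor");
		// 4. Set the input URI for the MInputPart (the file selected in the open dialog; path is hypothetical).
		part.setInputURI("file:///some/path/My.extlibrary");
		// Add the part to the editor part stack defined in the Application Model.
		MPartStack stack = (MPartStack) modelService.find(
				"org.eclipse.emf.examples.library.e4editor.partstack.editor", app);
		stack.getChildren().add(part);
		// 5. Activate the part with the EPartService.
		partService.showPart(part, PartState.ACTIVATE);
	}
}
```

Note that a robust handler should also check that the part stack still exists, as mentioned in the TODO below.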

Dealing with Dialogs - getShell()

In 3.x we call getSite().getShell() to get the active shell for the part. In the EMF editor this happens as well. The alternative in e4 is this:

    @Inject
    @Named(IServiceConstants.ACTIVE_SHELL)
    private Shell activeShell;

So we replace all invocations of getSite().getShell() with activeShell.

Dirtiness

In 3.x an EditorPart implements ISaveablePart, which is the interface for marking the editor dirty (dirty meaning it has been edited and should be saved; the user is notified with an * next to the title of the part). In order for the workbench to know about a dirty editor, we had to call firePropertyChange(IEditorPart.PROP_DIRTY);

In e4 we have this alternative:

@Inject
private MDirtyable dirtyable;

Whenever the dirty state of the part changes, we have to call

dirtyable.setDirty(true); // or false

In the case of the EMF Editor, this happens on the CommandStack Listener and when the editor is saved.

Saving is done by adding the @Persist annotation to several methods:

doSave(IProgressMonitor monitor)
doSaveAs() [TODO, how would this work? It needs to be bound to an action; perhaps it needs @Execute instead of @Persist]


Adapters

The adapter concept is also supported by e4. In classical 3.x a part is consulted (adapted) for certain interfaces and returns an implementation if supported. For the EMF Editor the following is adapted:

  • IContentOutlinePage.class
  • IPropertySheetPage.class
Now we have a bit of an issue here, as neither of these classes is available as an e4 implementation.
However, the problems these classes try to solve, like dealing with selection, are handled very differently in e4.

The editor also listens for part changes, and activates the EMF editor whenever the properties page or outline becomes active and is related to the EMF editor.

Properties

The Properties concept is not implemented in e4, as we aim for pure e4 (Not compat layer).
See https://bugs.eclipse.org/bugs/show_bug.cgi?id=404884

[DECISION] Defined an MPart placeholder, for key bindings to work.

Outline

The outline concept is implemented in pure e4 in the simpleIDE demo (Tom Schindl).

[TODO] consider adopting this concept.

[DECISION] Defined an MPart placeholder, for key bindings to work. 

Step 5. Actionsets and Actions.

The 3.x EMF editor generates two action sets: one for the editor and one for the model.

ActionSet.1 
  • About
  • Open URI
  • Open
ActionSet.2
  • Model (New)

ActionSets in 3.x are used to group actions which belong to a certain task, usually represented in a certain perspective. Binding of ActionSets and Perspectives is done with the extension point org.eclipse.ui.actionSetPartAssociations.

Now, besides the fact that ActionSets are deprecated even in the 3.x platform (use Commands and Handlers instead), the generated EMF editor, as an RCP app, actually has only one perspective and doesn't bind the action sets to the defined perspective.

Other Actions

The EMF editor also creates various menus programmatically and adds global actions to the 3.x ActionBarAdvisor. These are:

File Menu

File -> [FILE_START]
            New -> [MB_ADDITIONS]*
            ----- ID: org.eclipse.emf.examples.library.e4editor.menuseparator.file.additions
            [MB_ADDITIONS]
            -----
            Close
            Close All
            -----
            Save
            Save As
            Save All
            ------
            Quit
            [FILE_END]

* The contributions between brackets [...] are markers for dynamic insertion in 3.x. For e4, the insertion points are IDs of other items. Menu separators can be pre-inserted in the Application Model. Fragments can then contribute relative to these.

Example: In our case we define a menu separator:  
org.eclipse.emf.examples.library.e4editor.menuseparator.file.additions

Model fragments use these IDs to contribute 'before' or 'after', as we will see later on. 

Creating the structure in e4 Commands, Handlers

The equivalent in e4 is to simply add Commands, Handlers and Key Bindings to the Application Model for most actions, and to insert the Application Model contributions in the right place using the IDs of UI model elements.

Command IDs

[BUG?] The documentation states that commonly used commands should use the IDs known from IWorkbenchCommandConstants. However, using these in some cases causes a menu or toolbar entry not to be shown. For example:

For the About command we should use "org.eclipse.ui.help.aboutAction". This causes the menu entry 'About' not to show.

Handlers

Note that the Handler implementations are interesting, as they use DI to get relevant objects like the workbench or the active part.

Most of the handlers are straightforward. We discuss some of the specific ones here.

OpenHandler

The Open handler re-uses functionality already in the 3.x EMF editor, like the methods to open a dialog and select a file based on extensions.

The functionality is however exposed to dependency injection via a HandlerSupport service made available through OSGi. The service provides facilities to open an editor, open a file dialog respecting the EMF model file extensions, and more. See HandlerSupport.

[TODO] Given the dynamic nature of the e4 Application Model, the PartStack holding the EMF editor can be closed. So fix the Open/New handlers to cope with a non-existing PartStack.

Key Bindings

An initial e4 Application model doesn't have any binding context associated.
A default binding context hierarchy will be defined as:

Binding Context - Window and Dialog (Applies to both)
             |
              - Binding Context Window
              - Binding Context Dialog

Binding Tables

Here we bind specific keys to commands for a Window or Dialog context. See further down which keys are associated with a context (via the binding table).
 

The following key bindings are specified in the EMF editor. The platform-neutral modifier names resolve as follows:

Modifier | Windows/Linux | Mac
M1       | Ctrl          | Command
M2       | Shift         | Shift
M3       | Alt           | Alt
M4       | Undefined     | Ctrl

Declarative:

M1+U => Open URI command
M1+O => Open command

Declarative through EABC (implicit by use of platform actions):

M1+W => Close
M1+M2+W => Close All
M1+S => Save
M1+M2+S=> Save All
M1+Q=>Quit
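As a small illustration (not Eclipse's actual KeySequence implementation), the modifier names used in the bindings above can be resolved to concrete keys per platform like this:

```java
// Illustration only: resolve the platform-neutral modifier names (M1..M4)
// used in the key bindings above to concrete keys per platform.
public class ModifierDemo {

    // macOS maps M1 to Command and M4 to Ctrl; other platforms map M1 to Ctrl
    // and leave M4 undefined.
    static String resolve(String binding, boolean macos) {
        return binding
                .replace("M1", macos ? "Command" : "Ctrl")
                .replace("M2", "Shift")
                .replace("M3", "Alt")
                .replace("M4", macos ? "Ctrl" : "Undefined");
    }

    public static void main(String[] args) {
        System.out.println(resolve("M1+M2+S", false)); // Ctrl+Shift+S
        System.out.println(resolve("M1+S", true));     // Command+S
    }
}
```

So M1+M2+S is Ctrl+Shift+S on Windows/Linux and Command+Shift+S on the Mac.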

Dynamic Contributions, the IEditorActionBarContributor

The 3.x EMF editor mimics the Menu and Toolbar structure of the Eclipse IDE. Actions which are specific to the IDE (New, Open, Open URI and all edit actions) are contributed, respectively, declaratively in the plugin.xml and by an IEditorActionBarContributor.

The solution for e4 requires some rework, as the contribution paradigm is different for e4.

Contributing the equivalent of 3.x actions in plugin.xml is done by creating fragments, which are added to our application. There will be two fragments: one for the EMF editor, and one contributed by org.eclipse.e4mf.edit.ui. The reason is that these actions could be contributed to an IDE instead of an RCP application.

[TODO] The contribution however is hardcoded in the fragment to specific. 

Notes on the position-in-list values (I state them here, as they were not documented in most tutorials):

first
index:$theindex$
before:$theotherelementsid$
after:$theotherelementsid$
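For example, a model fragment contributing a menu item after the separator we defined could look roughly like this; the parent menu ID and the contributed item are hypothetical, while the separator ID is the one defined above:

```xml
<!-- Sketch of a StringModelFragment; parentElementId and the contributed
     HandledMenuItem are hypothetical examples. -->
<fragments xsi:type="fragment:StringModelFragment"
           featurename="children"
           parentElementId="org.eclipse.emf.examples.library.e4editor.menu.file"
           positionInList="after:org.eclipse.emf.examples.library.e4editor.menuseparator.file.additions">
  <elements xsi:type="menu:HandledMenuItem"
            elementId="org.eclipse.emf.examples.library.e4editor.menuitem.example"
            label="Example Item"/>
</fragments>
```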
 
For the editor contributor, as there is no EditorPartDescriptor with an associated contributor, we need to mimic the functionality with dynamic contributions.

The contribution should occur only when the editor is active; in e4 that means when our designated MPart for the editor is active. The contribution is part of both the generated editor and emf.edit.ui.

To achieve this, we hook into the e4 event system. The editor checks whether an activated part's ID is the EMF editor's; if so, it sets a context variable.

[BUG] Unfortunately there is a bug: https://bugs.eclipse.org/bugs/show_bug.cgi?id=400217

Edit Menu

The Edit menu structure is

Edit
Undo   => M1+Z
Redo   => M1+M2+Z
----
Cut      => M1+X
Copy   => M1+C
Paste   => M1+V
----
Delete => Del
Select All => M1+A
[ADD_EXT]
[EDIT_END]
[MB_ADDITIONS]


Model Menu

EXTLibrary Editor
----[settings]
----[actions]
New Child -> [Containment children for selection]
New Siblings -> [Siblings for selection] 
Validate
Control...
---- [additions] 
Load Resource...
--- [additions_end]
Refresh
Show Properties
--- [ui-actions]

Migrating the Actions

Most of the actions are part of org.eclipse.e4mf.edit.ui and build on JFace IAction. The IAction and Action implementations as such are not incompatible with e4. However, the e4 workbench doesn't accept IActions. [There is the compat layer, with MRenderedMenu, but we go for pure e4 here.]

EXTLibraryModelWizard

Fixing the wizard is required. Wizards appear to be a JFace-only implementation, but that is not exactly true here: the generated Model Wizard for the EMF editor also implements INewWizard.

Now why is this there in the first place? Well, this is really to contribute the wizard to the workbench
when running in IDE mode.

[DECISION] Remove 'implements INewWizard'

As a consequence, we also need to fix how the editor is opened; but as we refactored this for the OpenHandler into a HandlerSupport service, we simply call that method and have no dependency on the 3.x IWorkbench.

Additionally, we would like our Model Wizard to be part of the injection context.
To achieve this, we simply add the following annotation to the class definition:

@Creatable

Now, a new instance will be created by the e4 DI, whenever we refer to the EXTLibraryModelWizard class in our Handler constructor for the 'New' command.
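As a hedged sketch (the handler class is hypothetical), the handler for the 'New' command could receive the wizard like this; because of @Creatable, the e4 injector constructs a fresh EXTLibraryModelWizard instance for it:

```java
public class NewCommandHandler {

	private final EXTLibraryModelWizard wizard;

	@Inject
	public NewCommandHandler(EXTLibraryModelWizard wizard) {
		// A new wizard instance is created by the e4 DI, thanks to @Creatable.
		this.wizard = wizard;
	}

	@Execute
	public void execute(@Named(IServiceConstants.ACTIVE_SHELL) Shell shell) {
		// Open the wizard in a standard JFace wizard dialog.
		new WizardDialog(shell, wizard).open();
	}
}
```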

org.eclipse.e4mf.common.ui 

Step 1. Renaming the packages.
Step 2. I remove the dependency on org.eclipse.ui and org.eclipse.ui.ide, which of course breaks a lot of stuff. 

org.eclipse.ui
=> org.eclipse.swt
=> org.eclipse.jface
=> org.eclipse.ui.workbench

.swt and .jface are re-exported by org.eclipse.ui, so removing it causes common.ui to break.
Now, as rendering is decoupled from the UI model, with .swt and .jface on top, it makes sense to let the current common.ui plugin be just one such rendering implementation. Later on, we could have a common plugin with an alternative to SWT. For the sake of not over-complicating things, I don't separate the UI model part from the rendering (SWT) just yet, but it would be required. 

So, I add the following dependencies:
=> org.eclipse.swt
=> org.eclipse.jface

Looking at what is not resolved, this is more or less what I expected:
things like EditorPart, Memento, PlatformUI, AbstractUIPlugin etc. These 'services' are all done differently in e4, so I get to work on them one by one :-)

Step 3. EclipseUIPlugin =>

Now this class extends AbstractUIPlugin, for which e4 services offer similar functionality for preferences, dialog settings and accessing resources like images.

[DECISION] For now it's best to let EclipseUIPlugin extend Plugin instead of AbstractUIPlugin. This means the aforementioned services will need to be provided the e4 way.

Step 4. Diagnostic Component =>

This class requires the shared images normally available from PlatformUI. We don't have PlatformUI in e4, so this should be migrated by using an injected resource service.

After some research: the 3.x SharedImages are not exposed as resources with the IResourcePool concept of e4 (part of tools.services). For EMF, I decide to create such a resource provider, or more specifically a provider for the workbench images.

Note that in 3.x the images are registered in an ImageRegistry by the class WorkbenchImages.
The actual images are stored in org.eclipse.ui.

See this bug for the solution: https://bugs.eclipse.org/bugs/show_bug.cgi?id=404727


Workspace Stuff =>

One of the functionalities of EMF editors is the ability to interact with the workspace. The workspace is

.....TODO Continue migration of emf.common.ui


org.eclipse.e4mf.edit.ui



Step 1. Renaming the packages
Step 2. Extended Image Registry

Fix the fallback to PlatformUI for getting images.

Wednesday, April 3, 2013

Eclipse 4 Injecting a Resource Pool

Objective: Migration of Eclipse 3.x shared images to e4

Requirements:
e4 Tooling (Eclipse Download)
e4 Tooling (Vogella Download) 

Example: The example can be obtained here
Level: Intermediate. Requires basic e4, understanding of OSGi and DI in e4, and general Eclipse RCP experience

Everything about Eclipse 4 is different from programming against Eclipse 3.x. One example is obtaining resources like images, colors and fonts. Eclipse 4 is very good at exposing OSGi services through dependency injection, and a service for getting resources is already defined in the e4 tooling.

To make this very concrete, here is an example of how to register resources which can then be obtained through a so-called IResourcePool.

What I needed was access to the shared images from the 3.x framework.
Typically these resources would be obtained with

ISharedImages sharedImages = PlatformUI.getWorkbench().getSharedImages();
Image img = sharedImages.getImage(ISharedImages.IMG_OBJS_ERROR_TSK);

In e4 there is no PlatformUI singleton, and as far as I investigated, the images in ISharedImages are not available if pure e4 is chosen. So how do we deal with this? Well, it turns out the e4 tooling has a service which can be implemented to make resources available. How does it work?

The solution is like this:

First, define an OSGI service which looks like this:


<?xml version="1.0" encoding="UTF-8"?>
<scr:component xmlns:scr="http://www.osgi.org/xmlns/scr/v1.1.0" immediate="true" name="org.eclipse.e4.tools.resources.workbenchresourcess">
   <implementation class="org.eclipse.e4.tools.resources.WorkbenchResourceProvider"/>
      <service>
      <provide interface="org.eclipse.e4.tools.services.IResourceProviderService"/>
   </service>
   <properties entry="OSGI-INF/resources.properties"/>
</scr:component>


This file should be stored in a folder named OSGI-INF under the plugin root.
What it does:

  1. First it tells us that this service is an implementation of IResourceProviderService. This service is defined in org.eclipse.e4.tools.services so this plugin should be available through e4 tooling.
  2. The implementation of the service is named WorkbenchResourceProvider and looks like below. This class extends a convenient implementation, also from the e4 tooling named BasicResourceProvider. Our implementation only needs to define keys (static strings) which the service will use to find the resources. 
  3. The resources are found by binding the keys to a location in a file named resources.properties, which is also defined in the OSGI component. 
WorkbenchResourceProvider (this is an extract; the example includes all keys which are in ISharedImages):

public class WorkbenchResourceProvider extends BasicResourceProvider {
 
 /**
     * Identifies the error overlay image.
     * @since 3.4
     */
    public final static String IMG_DEC_FIELD_ERROR = "IMG_DEC_FIELD_ERROR"; //$NON-NLS-1$

    /**
     * Identifies the warning overlay image.
     * @since 3.4
     */
    public final static String IMG_DEC_FIELD_WARNING = "IMG_DEC_FIELD_WARNING"; //$NON-NLS-1$

    /**
     * Identifies the default image used for views.
     */
    public final static String IMG_DEF_VIEW = "IMG_DEF_VIEW"; //$NON-NLS-1$

    // ... more keys omitted
}

resources.properties
IMG_OBJS_ERROR_TSK=/icons/full/obj16/error_tsk.gif
IMG_OBJS_INFO_TSK=/icons/full/obj16/info_tsk.gif
IMG_OBJS_WARN_TSK=/icons/full/obj16/warn_tsk.gif

The example includes all images which are packaged with the plugin org.eclipse.ui
Note: Not all keys are implemented in the resources.properties file. Please add missing keys at will.
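As a plain-Java illustration of the binding mechanism (not the e4 service itself), loading the same key/path pairs with java.util.Properties shows how a key resolves to a bundle-relative image path:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Illustration only: how resources.properties binds symbolic keys to image paths.
public class ResourceKeyDemo {

    // The same key/path pairs shown in the resources.properties listing above.
    static final String PROPS =
            "IMG_OBJS_ERROR_TSK=/icons/full/obj16/error_tsk.gif\n" +
            "IMG_OBJS_INFO_TSK=/icons/full/obj16/info_tsk.gif\n" +
            "IMG_OBJS_WARN_TSK=/icons/full/obj16/warn_tsk.gif\n";

    // Resolve a key to its bundle-relative image path, as the provider would.
    static String resolve(String key) {
        Properties p = new Properties();
        try {
            p.load(new StringReader(PROPS));
        } catch (IOException e) {
            throw new RuntimeException(e); // cannot happen for an in-memory reader
        }
        return p.getProperty(key);
    }

    public static void main(String[] args) {
        System.out.println(resolve("IMG_OBJS_ERROR_TSK")); // /icons/full/obj16/error_tsk.gif
    }
}
```

The real BasicResourceProvider then loads the image found at that path from the bundle.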

How to use this service:

  1. Include the resources plugin as part of an rcp product or launch configuration. 
  2. Make sure the plugin is auto-started and that the start level is 0 or 1, so the service is available before the app tries to access it. 
  3. To use it in a class do the following: 

@Inject
IResourcePool poolOfResources;

Image img = poolOfResources.getImageUnchecked(WorkbenchResourceProvider.IMG_OBJS_ERROR_TSK);


Perhaps the implementation of the resource service could be smarter and find the images by type as explained here; this is also how the 3.x images are registered. But for now this works very well.


The example can be obtained here

Have fun!

What if it doesn't work

Very likely the OSGi service is not available when consulted. Make sure the service is available.
Note: services can be inspected from the OSGi console.


What is the future of this

I don't know, but I think the resources from 3.x should be available to e4 apps. Follow the bug here.

Friday, March 29, 2013

Migrating Instiki to Rails 3.2

Objective: Migration of Instiki to Rails 3.2 

Instiki is a great wiki system, which I use to drive the content of http://www.netxforge.com. Instiki however lags behind with respect to the Rails version it was created with. This is a problem for me, as I want to integrate Instiki into a broader Ruby on Rails application. 

This blog describes the steps I took. The conversion is unfortunately not available as a download or in a repo somewhere, but I hope the list of issues (and how I solved them) can help others repeat it and make the upgrade available. 

The creator of Instiki, alias Distler, has endorsed this initiative in the Instiki forum.

So here are the steps: 

Warning: some of this is perhaps a bit specific to my setup, so be careful. Also, I haven't done everything yet, and some things don't work correctly. One example is the use of JavaScript: in Rails 3, jQuery is favoured over Prototype.js, and the use of assets and separation of JavaScript from ERB templates is good practice. Some of this is still to be done. 


----- (I recommend starting the server after each step, to get a feel for the progress)


Step 1. Use a Rails 3.2 generated app to compare the current Instiki and the Rails 3.2 structure. We refer to this as the Template Rails App (TRA).

Step 1.1 Update the Gemfile (It now looks like this, see explanation in separate steps)

source "http://rubygems.org"

gem "rails", "=3.2.12"

# Gems used only for assets and not required
# in production environments by default.
group :assets do
  gem 'sass-rails',   '~> 3.2.3'
  gem 'coffee-rails', '~> 3.2.1'

  # See https://github.com/sstephenson/execjs#readme for more supported runtimes
  # gem 'therubyracer', :platforms => :ruby

  gem 'uglifier', '>= 1.0.3'
end

gem 'jquery-rails'

gem "sqlite3", :require => "sqlite3"
gem "itextomml", ">=1.4.10"
gem "rack", ">=1.4.5"
gem "mongrel", ">=1.2.0.pre2"
gem "rubyzip"
gem "RedCloth", ">=4.0.0"
gem "erubis"
gem "nokogiri"
gem "rake"
gem "rdoc"
gem "json"
gem "file_signature", :git => 'http://github.com/distler/file_signature.git'
gem "maruku", :git => 'http://github.com/distler/maruku.git', :branch => 'nokogiri'
# gem "mysql2"

Step 2. Replace the /script's content with the TRA's /scripts content, delete the old files from /script

Step 3. Migration of the /config folder

Step 3.1 Create application.rb (Rails 3 has an application.rb file) and configure it in the steps below.
   
    Load the /lib folder, as autoloading of it is turned off; edit/add these parameters:

config.autoload_paths << "#{Rails.root}/lib"
config.autoload_paths << "#{Rails.root}/lib/chunks"
    Do some JavaScript setup for the new Rails 3 assets concept. Instiki should actually migrate away from scriptaculous.js and use jQuery.
        
config.action_view.javascript_expansions[:legacy] = %w(prototype.js scriptaculous.js)

Step 3.3 copy in the environments/ and initializers/ from TRA     

The TRA application name will be different from what Instiki's should be. The following files should be edited, and the first line should be renamed from 

"TemplateApp::Application.configure do" to "InstikiApp::Application.configure do"
 
config/environments/development.rb     
config/environments/test.rb     
config/environments/production.rb 


Step 3.4 Move the original environment.rb out of the way (rename it to environment.rb.backup), and copy in the environment.rb from the template rails app.

Some migrations of the contents of this file:

Rename the last line to InstikiApp::Application.initialize!     

require_dependency 'instiki_errors' => moved this to a custom initializer named config/initializers/instiki_init.rb. It looks like this: 
 
require 'instiki_errors'
require 'wiki_content'
        

require 'instiki_errors' # migrated from instiki environment.rb
require 'wiki_content' # needed to load properly [TODO]

rexml_versions => Not sure what to do with this. It scans various directories to get an REXML version.

# Miscellaneous monkey patches (here be dragons ...)

require 'caching_stuff'
require 'logging_stuff'
require 'rack_stuff'

Note: Not using require_dependency, as this is undocumented (Rails internal) and for development only. Not really a requirement here, I believe. 

Step 3.4 Copy in boot.rb from TRA; remove preinitializer.rb, which is a pre-Rails 3 hack to get Bundler working. 

See: http://gembundler.com/v1.3/rails23.html 

Step 3.5 routes.rb     [Careful] This routes.rb is slightly specific to my application, but it includes the wiki routes, so you can extract the relevant ones:
 
[TODO] Some routes don't work in the Rails 3 Instiki and need to be fixed.
 
def connect_to_web(generic_path, generic_routing_options, *options)
  if defined? DEFAULT_WEB
    explicit_path = generic_path.gsub(/:web\/?/, '') # Strip the /:web
    explicit_routing_options = generic_routing_options.merge(:web => DEFAULT_WEB)
    match explicit_path, explicit_routing_options
  end

  match generic_path, generic_routing_options
# map.connect(generic_path, generic_routing_options)
end

# :id's can be arbitrary junk
id_regexp = /.+/

InstikiApp::Application.routes.draw do

# SEE:  http://yehudakatz.com/2009/12/26/the-rails-3-router-rack-it-up/

  root :to => 'public#page', :id => 'HomePage'

  # Wiki Admin:

  match 'create_system', :to => 'admin#create_system'
  match 'create_web', :to => 'admin#create_web'
  match 'delete_web', :to => 'admin#delete_web'
  match 'delete_files', :to => 'admin#delete_files'
  match 'web_list', :to => 'wiki#web_list'

  # Application
  match ':controller/:action(/:id)'

  # Wiki webs routing
  connect_to_web ':web/edit_web',  :to => 'admin#edit_web' #Edit an arbitrary web.
  connect_to_web ':web/remove_orphaned_pages',  :to => 'admin#remove_orphaned_pages' #Remove pages which are not referenced by any other page
  connect_to_web ':web/remove_orphaned_pages_in_category',  :to => 'admin#remove_orphaned_pages_in_category'
  connect_to_web ':web/file/delete/:id',  :to => 'file#delete', :constraints => {:id => /[-._\w]+/}, :id => nil
  connect_to_web ':web/files/pngs/:id',  :to => 'file#blahtex_png', :constraints => {:id => /[-._\w]+/}, :id => nil
  connect_to_web ':web/files/:id',  :to => 'file#file', :constraints => {:id => /[-._\w]+/}, :id => nil
  connect_to_web ':web/file_list/:sort_order',  :to => 'wiki#file_list', :sort_order => nil
  connect_to_web ':web/import/:id',  :to => 'file#import'
  connect_to_web ':web/login',  :to => 'wiki#login'
  connect_to_web ':web/web_list',  :to => 'wiki#web_list'
  connect_to_web ':web/show/diff/:id', :to => 'wiki#show', :mode => 'diff', :requirements => {:id => id_regexp}
  connect_to_web ':web/revision/diff/:id/:rev',  :to => 'wiki#revision', :mode => 'diff', :constraints => { :rev => /\d+/, :id => id_regexp}
  connect_to_web ':web/revision/:id/:rev',  :to => 'wiki#revision', :constraints => { :rev => /\d+/, :id => id_regexp}
  connect_to_web ':web/source/:id/:rev', :to => 'wiki#source', :constraints => { :rev => /\d+/, :id => id_regexp}
  connect_to_web ':web/list/:category',  :to => 'wiki#list', :constraints => { :category => /.*/}, :category => nil
  connect_to_web ':web/recently_revised/:category',  :to => 'wiki#recently_revised', :requirements => { :category => /.*/}, :category => nil
  connect_to_web ':web/:action/:id',  :to => 'wiki', :constraints => {:id => id_regexp}
  connect_to_web ':web/:action', :to =>  'wiki'
  connect_to_web ':web',  :to => 'wiki#index'
end

Step 4. Problem with the plugin protect_forms_from_spam: comment it out and find an alternative.

Step 5. Deal with assets (Stylesheets, Javascript, Images)

    Read this: http://guides.rubyonrails.org/asset_pipeline.html
   
    Assets should be pre-compiled with:
        bundle exec rake assets:precompile
    These will end up in the /public/assets/ folder.

Step 5.1 Update the Gemfile to include:

    # Gems used only for assets and not required
    # in production environments by default.
    group :assets do
      gem 'sass-rails',   '~> 3.2.3'
      gem 'coffee-rails', '~> 3.2.1'

      # See https://github.com/sstephenson/execjs#readme for more supported runtimes
      # gem 'therubyracer', :platforms => :ruby

      gem 'uglifier', '>= 1.0.3'
    end

Step 5.2 Create asset folders
   
    /app/assets/stylesheets

        - copy in application.css from TRA
        - [OPTIONAL] rename .css to .css.erb to use assets in CSS for example: <%= asset_path 'someimage.png' %>        

    /app/assets/javascripts

        - application.js is auto-created, but we already have an application.js file with a bit of script in it, so copy the following lines
        from TRA and add them to the Instiki application.js:
       
        // This is a manifest file that'll be compiled into application.js, which will include all the files
        // listed below.
        //
        // Any JavaScript/Coffee file within this directory, lib/assets/javascripts, vendor/assets/javascripts,
        // or vendor/assets/javascripts of plugins, if any, can be referenced here using a relative path.
        //
        // It's not advisable to add code directly here, but if you do, it'll appear at the bottom of the
        // the compiled file.
        //
        // WARNING: THE FIRST BLANK LINE MARKS THE END OF WHAT'S TO BE PROCESSED, ANY BLANK LINE SHOULD
        // GO AFTER THE REQUIRES BELOW.
        //
        //= require jquery
        //= require jquery_ujs
        //= require_tree .

    /app/assets/images
        - use image_tag helper methods.

Step 5.3
   
    Copy the assets from /public of the 2.x Instiki into the respective locations under /app/assets.

Step 6. Replace @controller with controller wherever it occurs in the controllers.

Step 7. Fix issues with rendering and the default layout.
    [DEBUG]: Render an action without the layout for troubleshooting: render :layout => false
    Fix the layout content insertion point: the old @content_for_layout becomes the modern yield, so => <%= yield %>

    Use: <%= debug params %>

Step 8. problem with sublayout: (Wiki source)
   
    Showing /Users/Christophe/Documents/Spaces/netxforge_aptana/com.netxforge.store/app/views/layouts/application.html.erb where line #46 raised:

    undefined method `sub_layout' for #<WikiController:0x007ffb07b5bf58>

Step 9. link_to_remote issues; for now worked around by not using the :update tag. [TODO]

    Currently links like this are generated, which embed JavaScript into the link with an onclick statement:

    <a onclick="new Ajax.Updater('intropage', '/public/page/features?menu=true&partial=true',
    {asynchronous:true, evalScripts:true}); return false;" href="#"></a>

    :update is not supported anymore in Rails 3 (an AJAX callback to update a DOM id); see the following article:

    http://www.simonecarletti.com/blog/2010/06/unobtrusive-javascript-technique/

    The new approach is based on separation of HTML and JS, so the JS should do the update.

Step 10. Issues with form_tag

    In the template edit.html.erb, the form tag starts with the '<%' ERB tag, but should start with '<%='.
    Replacing this fixed the problem.

    See 3.0 Release notes, section 7.4.2.
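
    The fix, sketched as a hypothetical edit.html.erb fragment (the action and field names are illustrative, not Instiki's actual ones):

```erb
<%# Rails 3: block helpers like form_tag must be output with '<%=', not '<%'. %>
<%= form_tag({ :action => 'save' }) do %>
  <%= text_area_tag 'content', @content %>
  <%= submit_tag 'Save' %>
<% end %>
```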

    Note: The included JavaScript should be moved to application.js or another .js file under /app/assets/javascripts

Step 11. ActiveModel::MassAssignmentSecurity errors on various model objects [SOLVED]
    Caused by Rails' mass-assignment protection, explained here:
   
    https://gist.github.com/peternixey/1978249

    Added attr_accessible whitelists in various model objects to fix these errors:

    Page =>  attr_accessible :locked_at, :locked_by, :name
    Revision=> attr_accessible :revised_at, :page, :content, :author
    .....

Step 12. Undefined method error in WikiContent [SOLVED]
    It turned out that
    include ChunkManager (in wiki_content.rb) didn't load properly, as
    ChunkManager has dependencies on various other classes in /chunks.
    Made sure these are loaded in application.rb (See Step 3.1)

    However, this causes another issue:

Step 12.1 Error when including /lib/chunks
    Expected .../lib/chunks/wiki.rb to define Wiki
    Actually the whole app now fails with different errors:

    This could be a conflict in naming of classes, as /chunks defines a wiki.rb
    See this post:
    http://stackoverflow.com/questions/10948779/expected-to-define-when-calling-class-inside-a-module

    Fixed by:

    - Renamed wiki.rb to wiki_c.rb and updated references to it.
    - Removed the call to html_safe in WikiContent.render!, as it produces an
    ActiveSupport::SafeBuffer instance from the WikiContent, so not adhering to the
    expected instance type; this gave method errors.

Step 13. URL generator in wiki gives wrong URLs for: [TODO]
    all Pages => /:web/list/HomePage (the :page is appended and should not be).
    edit Web => /:web/edit_web/HomePage (:page appended, should not be).

Step 14. Dealing with XML templates.
    Renaming atom.rxml to atom.builder simply solved the problem.

Step 15. Grab a coffee, and reflect on your great achievements so far :-)

Step 16. Warning: You don't need to install rails_xss as a plugin for Rails 3 and after. [TODO]
   
    http://simianarmy.com/post/11117853564/upgrading-to-rails-3-rails-xss
    - What to do with this? It's not clear to me what the new situation is.

Step 17. [2013-03-13 12:26:00] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true

    Caused by the WEBrick server; don't worry about it.


Thursday, December 27, 2012

Rolling the Calendar

For NetXStudio, I needed a function which splits a period into sub-periods. The function needs to be generic, so I can pass in a Calendar field which determines how the Period object is split.

First some definitions:
  • The DateTimeRange object is a simple Java object with set and get methods for two Date fields named 'begin' and 'end'. 
  • Day start is defined at 00:00h and the day ends at 23:59h (yes, there is one lost minute, but that's OK for me). 
The Java Calendar, or rather the GregorianCalendar, is capable of rolling forward and backward as a real calendar would; the method roll(...) is used for that. The Calendar can also be forced into a specific position using the set(...) method.

My method to deal with at least Calendar.MONTH and Calendar.DAY_OF_MONTH ended up using the methods getActualMinimum(...) and getActualMaximum(...) to set the Calendar to the desired position. The desired positions are:
  • Calendar.MONTH => The first day of the current month. 
  • Calendar.DAY_OF_MONTH => The first hour of the day. 
Now to roll into these positions, I need to know the 'child' Calendar field, to be used in the getActualMin/Max() methods. By 'child' field I mean the following mapping:
  • Calendar.MONTH => Calendar.DAY_OF_MONTH
  • Calendar.DAY_OF_MONTH => Calendar.HOUR_OF_DAY
Calling getActualMinimum(Calendar.DAY_OF_MONTH) and setting this as the new Calendar position would set the Calendar to the first day of its current month. Calling getActualMinimum(Calendar.HOUR_OF_DAY) and setting this on the Calendar would set it at the first hour of the day.
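
A minimal, self-contained sketch of this parent/child technique (the date and class name are purely illustrative):

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class RollDemo {
    public static void main(String[] args) {
        // Position the Calendar somewhere mid-month: 27-12-2012 @ 14:30.
        Calendar cal = new GregorianCalendar(2012, Calendar.DECEMBER, 27, 14, 30);

        // Child field of MONTH is DAY_OF_MONTH: setting it to its actual
        // minimum moves the Calendar to the first day of the current month.
        cal.set(Calendar.DAY_OF_MONTH, cal.getActualMinimum(Calendar.DAY_OF_MONTH));
        System.out.println(cal.get(Calendar.DAY_OF_MONTH)); // 1

        // Child field of DAY_OF_MONTH is HOUR_OF_DAY: its actual minimum
        // moves the Calendar to the first hour of the day.
        cal.set(Calendar.HOUR_OF_DAY, cal.getActualMinimum(Calendar.HOUR_OF_DAY));
        System.out.println(cal.get(Calendar.HOUR_OF_DAY)); // 0
    }
}
```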

All fine... The unit test on a 6-month period, which was asked to split first by Calendar.MONTH and subsequently by Calendar.DAY_OF_MONTH, generated the intended result nicely.

Full: From: 27-06-2012 @ 00:00 To: 27-12-2012 @ 00:00
 Month: From: 01-12-2012 @ 00:00 To: 27-12-2012 @ 23:59
   Day: From: 27-12-2012 @ 00:00 To: 27-12-2012 @ 23:59
   Day: From: 26-12-2012 @ 00:00 To: 26-12-2012 @ 23:59
   Day: From: 25-12-2012 @ 00:00 To: 25-12-2012 @ 23:59
   Day: From: 24-12-2012 @ 00:00 To: 24-12-2012 @ 23:59
   Day: From: 23-12-2012 @ 00:00 To: 23-12-2012 @ 23:59
   Day: From: 22-12-2012 @ 00:00 To: 22-12-2012 @ 23:59
   Day: From: 21-12-2012 @ 00:00 To: 21-12-2012 @ 23:59

Being confident about my code, I expanded it by adding an option to split the period into weeks. This is where trouble started. First I had to pick the 'child' field for the day of the week for the 'parent' field Calendar.WEEK_OF_YEAR.

I quickly chose Calendar.DAY_OF_WEEK for this; makes sense, right? Well, that's what I initially thought. Setting the Calendar to getActualMinimum(Calendar.DAY_OF_WEEK) didn't actually set the Calendar position to what I expected. It took me a while to figure out why. The reason is best explained with the following diagram:




In the diagram, the Calendar is set to today (which happens to be the publication date of this blog post, but this is purely a coincidence, trust me on this). Calling getActualMinimum with my freshly developed method for Calendar.DAY_OF_MONTH and Calendar.HOUR_OF_DAY works as expected: the Calendar rolls to the intended position. For Calendar.DAY_OF_WEEK, however, it doesn't. It actually rolls forward! What I really expected was the new Calendar position to be at least the previous Sunday. And even that would not be correct, as it should be the Monday for my Locale.

Mmmh... it triggered my curiosity (OK, first there were a few moments of programmer's frustration...). Why does it do that? I read about Calendar and noticed a bit of documentation on Calendar field conflicts. Suddenly I realized I should not use the actual minimum for rolling the week day: getActualMinimum(Calendar.DAY_OF_WEEK) just returns the smallest legal field value, which is always Calendar.SUNDAY, regardless of the Locale. What I should use instead is getFirstDayOfWeek() and set this on the calendar.
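
A small sketch illustrating the difference. I picked Locale.GERMANY because its week starts on Monday; the date is this post's publication date, a Thursday:

```java
import java.util.Calendar;
import java.util.Locale;

public class WeekDemo {
    public static void main(String[] args) {
        // A Locale whose week starts on Monday, positioned on
        // Thursday, 27-12-2012.
        Calendar cal = Calendar.getInstance(Locale.GERMANY);
        cal.set(2012, Calendar.DECEMBER, 27);
        cal.getTime(); // normalize the week fields

        // getActualMinimum(DAY_OF_WEEK) is just the smallest legal field
        // value, Calendar.SUNDAY (1); it knows nothing about the Locale.
        System.out.println(cal.getActualMinimum(Calendar.DAY_OF_WEEK)); // 1
        // getFirstDayOfWeek() is Locale aware: Calendar.MONDAY (2) here.
        System.out.println(cal.getFirstDayOfWeek()); // 2

        // Setting DAY_OF_WEEK stays within the current week, so the
        // Locale-aware call lands on Monday 24-12-2012, as intended.
        cal.set(Calendar.DAY_OF_WEEK, cal.getFirstDayOfWeek());
        System.out.println(cal.get(Calendar.DAY_OF_MONTH)); // 24
    }
}
```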

OK, but this means that, depending on the field, my method needs to act differently, which was not my initial goal. So far I haven't found another solution with the current GregorianCalendar capabilities. I even had to implement a function getLastDayOfWeek(Calendar cal) to set the end boundary of the week. I encourage any reader to comment and let me know a better solution. (Note: I know about date/time libraries, but without them, can you make this better?)

Here is the code for the final solution:

public List<DateTimeRange> periods(DateTimeRange dtr, int calField) {

  boolean weekTreatment = false;

  int childField = -1;
  switch (calField) {
  case Calendar.MONTH:
    childField = Calendar.DAY_OF_MONTH;
    break;
  case Calendar.DAY_OF_MONTH:
    childField = Calendar.HOUR_OF_DAY;
    break;
  case Calendar.WEEK_OF_YEAR:
    childField = Calendar.DAY_OF_WEEK;
    weekTreatment = true;
    break;
  }

  final List<DateTimeRange> result = Lists.newArrayList();

  // Unsupported field: return the full range as a single period.
  if (childField == -1) {
    result.add(dtr);
    return result;
  }

  final Calendar cal = GregorianCalendar.getInstance();
  cal.setTime(dtr.getEnd().toGregorianCalendar().getTime());

  // An end calendar to compare the calendar field against, so we keep the
  // actual end value instead of the field maximum for the first period.
  final Calendar endCal = GregorianCalendar.getInstance();
  endCal.setTime(dtr.getEnd().toGregorianCalendar().getTime());

  // Go back in time and create a new DateTimeRange per period.
  do {

    // Set the end to the actual maximum of the child field, except for
    // the first (most recent) period, where we keep the actual end.
    if (cal.get(calField) != endCal.get(calField)) {
      if (weekTreatment) {
        // :-( there is no method to get the last day of the week.
        cal.set(childField, getLastDayOfWeek(cal));
      } else {
        cal.set(childField, cal.getActualMaximum(childField));
      }
    }

    final Date end = cal.getTime();

    // Special treatment for weeks: the actual minimum of DAY_OF_WEEK is
    // not Locale aware, so use getFirstDayOfWeek() instead.
    if (weekTreatment) {
      cal.set(Calendar.DAY_OF_WEEK, cal.getFirstDayOfWeek());
    } else {
      cal.set(childField, cal.getActualMinimum(childField));
    }

    // Clip the begin to the start of the full range.
    Date begin;
    if (cal.getTime().getTime() < dtr.getBegin().toGregorianCalendar()
        .getTimeInMillis()) {
      begin = this.fromXMLDate(dtr.getBegin());
    } else {
      begin = cal.getTime();
    }

    final DateTimeRange period = period(this.adjustToDayStart(begin),
        this.adjustToDayEnd(end));
    result.add(period);

    // Roll back one more, so the calendar moves into the previous period
    // and a new actual minimum/maximum can be applied.
    cal.add(calField, -1);
  } while (cal.getTime().getTime() > dtr.getBegin().toGregorianCalendar()
      .getTimeInMillis());

  return result;
}


Notes:

  1. Usage of Google Collections (Guava) to produce collections.
  2. The DateTimeRange type holds 'begin' and 'end' member fields. 
  3. The method period(Date begin, Date end) is a factory for an instance of type DateTimeRange. 
  4. The method ignores the part of the period which precedes the intended boundary.
  5. The method getLastDayOfWeek(Calendar cal) looks like this:

public int getLastDayOfWeek(Calendar cal) {
  final int firstDayOfWeek = cal.getFirstDayOfWeek();

  final int lastDayOfWeek;
  if (firstDayOfWeek != Calendar.SUNDAY) {
    lastDayOfWeek = firstDayOfWeek - 1; // One before the first day...
  } else {
    // The first day is SUNDAY (1), so the last day is the field
    // maximum, SATURDAY (7).
    lastDayOfWeek = cal.getActualMaximum(Calendar.DAY_OF_WEEK);
  }
  return lastDayOfWeek;
}
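
A quick, self-contained check of getLastDayOfWeek; the demo class name and the two Locales are chosen purely for illustration:

```java
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.Locale;

public class LastDayOfWeekDemo {

    static int getLastDayOfWeek(Calendar cal) {
        final int firstDayOfWeek = cal.getFirstDayOfWeek();
        if (firstDayOfWeek != Calendar.SUNDAY) {
            return firstDayOfWeek - 1; // one before the first day
        }
        // Week starts on SUNDAY (1): last day is the maximum, SATURDAY.
        return cal.getActualMaximum(Calendar.DAY_OF_WEEK);
    }

    public static void main(String[] args) {
        // US weeks run Sunday..Saturday.
        Calendar us = GregorianCalendar.getInstance(Locale.US);
        System.out.println(getLastDayOfWeek(us) == Calendar.SATURDAY); // true

        // German weeks run Monday..Sunday.
        Calendar de = GregorianCalendar.getInstance(Locale.GERMANY);
        System.out.println(getLastDayOfWeek(de) == Calendar.SUNDAY); // true
    }
}
```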