Tuesday, October 11, 2016

OTN Appreciation Day: Programmatically Dismiss Popup

A couple of years ago I blogged about fading user feedback.  Also Duncan Mills had a solution for this, as did Frank Nimphius.

Just recently I was triggered by a blogpost of Shay Schmeltzer showing that in ADF 12.2.1.1 this can be done completely differently, and 100% declaratively. Where Shay's example is declarative, in this blogpost I describe how to do the same when the popup is created programmatically.

To explain this, first of all I need to show you how to invoke a popup programmatically. In order to do this we need to have a popup component defined on the page. To have access to the popup in java code, we need to bind the popup to a managed bean:

1:        <af:popup id="p1" animate="true"  binding="#{backingBeanScope.utilBean.dismissablePopup}" >  
2:        <af:panelGroupLayout id="pg1" layout="horizontal" halign="center" styleClass="AFStretchWidth">  
3:            <af:image source="#{resource['images:qual_approved_32_full.png']}" id="i1"/>  
4:            <af:outputFormatted value="I will disappear in #{backingBeanScope.utilBean.seconds} seconds" id="of1"/>  
5:          </af:panelGroupLayout>  
6:        </af:popup>  

This managed bean (in backing bean scope) also contains the code to invoke the popup. This code is called by a couple of different buttons. Just for the sake of this example, I created 3 buttons that show the popup for 1, 2 or 3 seconds respectively. In a real-life scenario, you would typically set the value based on, for instance, message severity, or maybe on some kind of user preference. Anyway, the buttons are on the page and call out to these methods in our managed bean.

1:              <af:gridRow marginTop="5px" height="auto" id="gr2">  
2:                <af:gridCell marginStart="5px" width="34%" id="gc4">  
3:                  <af:button text="Show Popup 1 second" id="b1"  
4:                        actionListener="#{backingBeanScope.utilBean.ShowPopupOneSeconds}"/>  
5:                </af:gridCell>  
6:                <af:gridCell marginStart="5px" width="33%" id="gc5">  
7:                  <af:button text="Show Popup 2 seconds" id="b2"  
8:                        actionListener="#{backingBeanScope.utilBean.ShowPopupTwoSeconds}"/>  
9:                </af:gridCell>  
10:                <af:gridCell marginStart="5px" width="33%" marginEnd="5px" id="gc6">  
11:                  <af:button text="Show Popup 3 seconds" id="b3"  
12:                        actionListener="#{backingBeanScope.utilBean.ShowPopupThreeSeconds}"/>  
13:                </af:gridCell>  
14:              </af:gridRow>  

In the managed bean, there are the 3 methods that are called by the buttons. In these methods, a member variable is used to set and hold the number of seconds that it takes before the popup disappears. It is very simple actually. There are 3 methods like the one below:

1:    public void ShowPopupOneSeconds(ActionEvent actionEvent) {  
2:      setSeconds(1);  
3:      ShowPopup();  
4:    }  

Finally, in the ShowPopup method, it all comes together: the popup is shown and then automatically disappears after the expected number of seconds:

1:    public void ShowPopup() {  
2:         RichPopup.PopupHints hints = new RichPopup.PopupHints();         
3:         this.getDismissablePopup().setAutoDismissalTimeout(this.seconds);        
4:         this.getDismissablePopup().show(hints);     
5:     }  
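Putting these pieces together, the managed bean could look roughly like the sketch below. Only the members shown in this post are certain; the class name, getters and setters are assumptions.

 import javax.faces.event.ActionEvent;  
 import oracle.adf.view.rich.component.rich.RichPopup;  

 // Sketch of backingBeanScope.utilBean; only what is shown in this post is certain  
 public class UtilBean {  

   private RichPopup dismissablePopup; // set through the popup's binding attribute  
   private int seconds;                // read by the af:outputFormatted value  

   public void ShowPopupOneSeconds(ActionEvent actionEvent) {  
     setSeconds(1);  
     ShowPopup();  
   }  

   public void ShowPopup() {  
     RichPopup.PopupHints hints = new RichPopup.PopupHints();  
     getDismissablePopup().setAutoDismissalTimeout(getSeconds());  
     getDismissablePopup().show(hints);  
   }  

   public void setDismissablePopup(RichPopup dismissablePopup) {  
     this.dismissablePopup = dismissablePopup;  
   }  

   public RichPopup getDismissablePopup() {  
     return dismissablePopup;  
   }  

   public void setSeconds(int seconds) {  
     this.seconds = seconds;  
   }  

   public int getSeconds() {  
     return seconds;  
   }  
 }  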

That is all. As of ADF 12.2.1.1 it is indeed very simple to make a popup disappear after some time.

Note: This blogpost is written as part of the OTN Appreciation Day Global Initiative. Thanks to all people at Oracle Technology Network, and the Oracle ACE Program for all the valuable work you do, and for helping the Oracle Community around the world to do their work.

Friday, November 27, 2015

ADF 12.x: Changing the List Of Values "No rows to display" text

In this post I describe how I implemented an interesting use case where the customer wanted to have an alternate message in the LOV when there is no data found. We all know how this can be changed in an af:table component by simply changing the emptyText property. In a List of Values this works 'slightly' differently.

Let's start with the basics

By default you will see the following List Of Values, telling you that no data was found:


However, my customer was not happy with this text, so I had to come up with something else. That is interesting, because the ListOfValues is an ADF component in itself and the individual components inside the LOV cannot be manipulated by means of properties. I had to find a way to get access to the table component inside the List Of Values.
The way to do this was described by Frank Nimphius a long while ago in his Forum Harvest posts. To be more specific, it is the harvest of November 2010, section "How-to define access keys to buttons displayed in an LOV dialog".

This really helped me to solve the issue and exactly describes what should happen at runtime.

  1. A user clicks the LOV button next to an Input List of Values component. In response to this, the LOV component launch event is fired, which we listen to in a managed bean.
  2. The launch event however is handled during the JSF Invoke Application phase, whereas the LOV displays during Render Response. Because of this, it is not possible to use a dialog launch listener to set the access keys for the buttons. Instead, the launch listener is used to invoke client side JavaScript.
  3. The client side JavaScript addresses an af:serverListener component to queue a custom event in ADF Faces. The custom event invokes a managed bean method that accesses the list of values component to eventually find the table component (the wiring is sketched right below this list).
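As a minimal, hedged sketch of this wiring (the component id, event name and bean are illustrative assumptions; the actual setup follows Frank Nimphius' harvest example), the launch listener could queue the client-side script like this:

 import javax.faces.context.FacesContext;  
 import oracle.adf.view.rich.event.LaunchPopupEvent;  
 import org.apache.myfaces.trinidad.render.ExtendedRenderKitService;  
 import org.apache.myfaces.trinidad.util.Service;  

 public class LovLaunchBean {  

   // referenced from the launchPopupListener attribute of the af:inputListOfValues  
   public void lovLaunchListener(LaunchPopupEvent launchPopupEvent) {  
     FacesContext fctx = FacesContext.getCurrentInstance();  
     ExtendedRenderKitService erks =  
         Service.getRenderKitService(fctx, ExtendedRenderKitService.class);  
     // queue a custom event on an af:serverListener (here named 'lovLaunchEvent');  
     // that serverListener in turn calls changeLovInternals on the server  
     String script =  
         "var lov = AdfPage.PAGE.findComponentByAbsoluteId('lovId');" +  
         "AdfCustomEvent.queue(lov, 'lovLaunchEvent', {}, true);";  
     erks.addScript(fctx, script);  
   }  
 }  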

I started with the code sample as provided in Frank's post and was able to extend it for my needs. The trick is in lines 32-39:

1:    //method called from the server listener  
2:    public void changeLovInternals(ClientEvent ce) {  
3:      //get the LOV component binding reference  
4:      RichInputListOfValues lov = getListOfValues();  
5:      String id = lov.getClientId(FacesContext.getCurrentInstance()); // kept from the original sample, not used further  
6:      RichPopup popup = null;  
7:      String facetName = null;  
8:      UIComponent componentFacet = null;  
9:      Iterator facetNames = lov.getFacetNames();  
10:      while (facetNames.hasNext()) {  
11:        facetName = (String) facetNames.next();  
12:        if (facetName != null) {  
13:          componentFacet = lov.getFacet(facetName);  
14:          if (componentFacet != null && componentFacet instanceof RichPopup) {  
15:            popup = (RichPopup) componentFacet;  
16:            break;  
17:          }  
18:        }  
19:      }  
20:      RichDialog lovDialog = (RichDialog) popup.getChildren().get(0);  
21:      lovDialog.setCancelTextAndAccessKey("&Take me back");  
22:      lovDialog.setAffirmativeTextAndAccessKey("&Make my choice");  
23:      //refresh the LOV to show the buttons  
24:      AdfFacesContext.getCurrentInstance().addPartialTarget(lovDialog);  
25:      // I found that the dialog consists of a panelGridLayout with 2 gridrows  
26:      if (lovDialog.getChildCount() == 1 && lovDialog.getChildren().get(0) != null &&  
27:        lovDialog.getChildren().get(0) instanceof RichPanelGridLayout) {  
28:        RichPanelGridLayout panelGridLayout =  
29:          (RichPanelGridLayout) lovDialog.getChildren().get(0);  
30:        List uiCompList = panelGridLayout.getChildren();  
31:        // always gridrows and always two.  
32:        if (uiCompList.size() > 0) {  
33:          // the first one is the search panel  
34:          // the second contains the table  
35:          RichGridRow richGridRow = (RichGridRow) uiCompList.get(1);  
36:          RichGridCell richCell = (RichGridCell) richGridRow.getChildren().get(0);  
37:          RichTable theTable = (RichTable) richCell.getChildren().get(0);  
38:          theTable.setEmptyText("Really, this is something, but it works");  
39:          AdfFacesContext.getCurrentInstance().addPartialTarget(theTable);  
40:        }  
41:      }  
42:    }  

When you run the application now, the LOV will display the alternate text.


Summary
If you know how to get into the internal structure of an inputListOfValues, or any other component, you will be able to manipulate these components. Note that there is no guarantee that these manipulations will work after an upgrade, because Oracle can change the implementation. This is for instance what happened with the inputListOfValues when the PanelGridLayout component became available: the internal structure of the ListOfValues component switched from using PanelGroupLayout and PanelHeader to using the PanelGridLayout component.
The solution in this post is implemented in 12.1.3. I did not check if Oracle changed the inputListOfValues in 12.2.

Resources
The code for this post can be found here on GitHub.

Most credits go to Frank Nimphius for his Forum Harvest post

Tuesday, September 08, 2015

ADF 12c : Using Jasper Reports and JasperSoft Studio 6.1; What Libraries do you need?

Over the last couple of years, or rather in the last decade, I have implemented several reporting solutions with Jasper Reports in ADF. I did that in ADF 10g, ADF 11.1.1.x, ADF 11.1.2.x and ADF 12.1.x.
I also used several versions of Jasper Reports. There is a whole lot of documentation, blogposts and presentations available. So when I got a request today from one of my customers to set up an implementation of Jasper Reports 6.1 in ADF 12.1.3, I did not expect any problems. Boy, was I wrong.

Here is the Story
With all the knowledge from the past, I decided to follow the known steps.
1) Download iReport Designer,
2) Build a report in iReport
3) Create an ADF application
4) Add the necessary libraries to use the report
5) Call the report from a button via a Managed Bean

Step 1
In the past I used iReport designer to build the reports. When you go to the download site of iReport designer you now see an interesting message.


So I took this seriously and decided not to use iReport Designer, but JasperSoft Studio instead. It proved to work pretty much the same, so I was able to create a simple sample report.

Step 2
The simple sample report uses a Data Adapter to an Oracle Database (HR) that can be created as in step 5 of this tutorial. Take special note of the classpath of the Database Driver library. I use the one that is shipped with JDeveloper, which can be found in:
1:  <jdeveloperhome>/oracle_common/rda/da/lib/ojdbc14.jar  

Now the report can be created using a simple query, for instance

 select * from EMPLOYEES  
Or you can use this tutorial to build your own report.

Steps 3, 4 and 5
In the new ADF application we need to make sure to add the JasperReports library to the ViewController project. I used jasperreports-6.1.0.jar.

With that in place I can now use my code from the past that needs to be in the backing bean to call the report from a button. That code should still work and is exactly as I show here.
1:  public class ReportingBean {  
2:    public ReportingBean() {  
3:    }  
4:    public void startReport(ActionEvent actionEvent) {  
5:      // Add event code here...  
6:      String filepath = "C:/JDeveloper/mywork/reports/MyReports/";  
7:      String reportname = "dummy.txt";  
8:      InputStream is;  
9:      try {  
10:        OutputStream os = new FileOutputStream(new File(filepath + reportname));  
11:        Map parameters = new HashMap();  
12:        // parameters.put("P_DEPARTMENT_ID", getDepartmentId());  
13:        Connection conn = getConnectionDS("java:comp/env/jdbc/MYHRDS");  
14:        JasperReport jasperReport;  
15:        //Note; we have two options. EIther we use the .jasper file and run the compiled report  
16:        //    or we use the .jrxml file that needs to be compiled at runtime.  
17:        if(runNoComplile){       
18:        // from .jasper file, so without compiling        
19:         InputStream jasperStream = new FileInputStream(new     
20:                 File("C:/JDeveloper/mywork/reports/MyReports/EMP_A4.jasper"));  
21:         jasperReport=   (JasperReport) JRLoader.loadObject(jasperStream);  
22:        // end from .jasper  
23:        }  
24:        else{  
25:        // from jrxml, so we do a runtime compile  
26:         is = new FileInputStream(new  
27:             File("C:/JDeveloper/mywork/reports/MyReports/EMP_A4.jrxml"));  
28:         JasperDesign jasperDesign = JRXmlLoader.load(is);  
29:         jasperReport = JasperCompileManager.compileReport(jasperDesign);  
30:       // end from jrxml  
31:        }  
32:        JasperPrint jasperPrint = JasperFillManager.fillReport(jasperReport, parameters, conn);  
33:        JasperExportManager.exportReportToPdfStream(jasperPrint, os);  
34:        JasperViewer.viewReport(jasperPrint, false);  
35:      } catch (FileNotFoundException e) {  
36:        e.printStackTrace();  
37:      } catch (JRException e) {  
38:        e.printStackTrace();  
39:      } catch (NamingException e) {  
40:        e.printStackTrace();  
41:      } catch (SQLException e) {  
42:        e.printStackTrace();  
43:      }  
44:    }  
45:    public static Connection getConnectionDS(String dsName) throws NamingException, SQLException {  
46:      Connection connection = null;  
47:      try {  
48:       javax.naming.Context initialContext = new javax.naming.InitialContext();  
49:       javax.sql.DataSource dataSource =   
50:            (javax.sql.DataSource)initialContext.lookup(dsName);  
51:       connection = dataSource.getConnection();  
52:       } catch(Exception e){  
53:         e.printStackTrace();  
54:         //or handle more gracefully   
55:       }  
56:      return connection;  
57:    }  
58:  }  

The necessary imports are added to the class from the jasperreports-6.1.0.jar library, the class compiles, the project builds and the application runs. However, at runtime, you will probably run into trouble: you will see several ClassNotFound exceptions. At least I did run into these. It took me a couple of hours before I was able to fix these issues. It all had to do with the library dependencies in the ViewController project.

So what exactly should you add, and where can you find it?
All the documentation tells you is that you need to add the jasperreports-6.1.0.jar library; however, there is more that should be added to make this all work.
From the past I know that we should add the following:


So I decided to add whatever I could find that resembles those libs that made my reports work in the past. In the iReport solution these libraries were located in:
1:  <reportshome>\iReport-4.0.1\ireport\modules\ext  

In the new situation, with JasperSoft Studio, I was unable to find these libraries (or the new versions of them) in the reportshome location. So where have they gone?

I finally managed to find them in the download of jasperReports-6.1.0-project.zip. If you download the file and unzip it you will find all necessary libraries in:
1:  jasperreports-6.1.0/lib  

You can now copy the libraries from there to a folder that you use to bundle those libraries into your ADF Project. So what I did was the following: I created a folder called:
1:  /extralibs/jasper  

And I copied the following libs to it:
  • groovy-all-2.0.1.jar
  • iText-2.1.7.js2.jar
  • jasperreports-6.1.0.jar
  • poi-3.10.1.jar
  • jfreechart-1.0.12.jar
  • jcommon-1.0.15.jar
  • ant-1.7.1.jar (not sure if I need this one)
This resembles pretty much what I used to have in previous versions of my reports implementation. I was convinced that it would work now, so I gave it a try. Almost everything worked...

Except for the report that needs to be compiled at runtime (line 29 in the code sample above). Too bad: compilation failed and I was almost giving up. After a lot of googling and trying, I finally found a solution. Apparently Jasper Reports uses the ecj (Eclipse Compiler for Java) library to compile reports. This library happens to be available from the same location as the ones that I added previously. Once I added this library to my project, all worked perfectly. The image below shows what libs were added by me in order to make it work.


Final note
I added the libraries to the ViewController project via the JDeveloper Library Dependencies. You can (or should) use Maven or ANT to manage the library dependencies, as this is much more flexible.

Thursday, September 03, 2015

IoT Hackathon Part IV : Using Web Services to send Sensordata

In the previous 3 posts, building towards the eProseed IoT Hackathon, I described how to set up your Raspberry Pi, and how to use the GrovePi sensors. The example used is a small weather-station that reads temperature and humidity and shows the readings on a display. That is all very nice; however, the data remains local on the Raspberry Pi, so there is nothing that we can do with this information from an 'enterprise' perspective. In this post I will explain how easy it is to send the data to whatever 'end point' by using a REST/JSON webservice.

The Database Tables
For this use case I decided I needed 2 tables. One to hold all my available sensors (yes, I know, I have only one) and one to store the measurements per sensor.


The Webservice

In order to store the data in the database tables I need to send it from the Raspberry Pi to the database. For that I use a simple REST/JSON webservice that has a POST method. I also implemented PUT, GET and DELETE methods, because they might come in handy later.

The webservice is described by the following WADL:
 <ns0:application xmlns:ns0="http://wadl.dev.java.net/2009/02">  
   <ns0:doc xmlns:ns1="http://jersey.java.net/" ns1:generatedBy="Jersey: 2.5.1 2014-01-02 13:43:00"/>  
   <ns0:doc xmlns:ns2="http://jersey.java.net/"  
        ns2:hint="This is simplified WADL with user and core resources only. To get full WADL with extended resources use the query parameter detail.  Link: http://yourhost.com:7101/IoTRestJsonService/resources/application.wadl?detail=true"/>  
   <ns0:grammars>  
     <ns0:include href="application.wadl/xsd0.xsd">  
       <ns0:doc title="Generated" xml:lang="en"/>  
     </ns0:include>  
   </ns0:grammars>  
   <ns0:resources base="http://yourhost.com:7101/IoTRestJsonService/resources/">  
     <ns0:resource path="iot">  
       <ns0:resource path="/sensordata">  
         <ns0:method id="createSensorData" name="POST">  
           <ns0:request>  
             <ns0:representation element="iotSensorData" mediaType="application/json"/>  
           </ns0:request>  
         </ns0:method>  
         <ns0:method id="updateSensorData" name="PUT">  
           <ns0:request>  
             <ns0:representation element="iotSensorData" mediaType="application/json"/>  
           </ns0:request>  
         </ns0:method>  
         <ns0:method id="findAllSensorData" name="GET">  
           <ns0:response>  
             <ns0:representation element="iotSensorData" mediaType="application/json"/>  
           </ns0:response>  
         </ns0:method>  
       </ns0:resource>  
       <ns0:resource path="/sensordata/{id}">  
         <ns0:param xmlns:xsd="http://www.w3.org/2001/XMLSchema" name="id" style="template" type="xsd:int"/>  
         <ns0:method id="deleteSensorData" name="DELETE"/>  
         <ns0:method id="getSensorDataById" name="GET">  
           <ns0:response>  
             <ns0:representation element="iotSensorData" mediaType="application/json"/>  
           </ns0:response>  
         </ns0:method>  
       </ns0:resource>  
     </ns0:resource>  
   </ns0:resources>  
 </ns0:application>  
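This WADL is what Jersey generates from a JAX-RS resource class. The actual implementation is not part of this post, but a minimal sketch of what such a resource could look like is shown below; the class name and the payload POJO are assumptions based on the WADL and on the JSON payload the Python script sends later on, not the real code.

 import javax.ws.rs.Consumes;  
 import javax.ws.rs.GET;  
 import javax.ws.rs.POST;  
 import javax.ws.rs.Path;  
 import javax.ws.rs.PathParam;  
 import javax.ws.rs.Produces;  
 import javax.ws.rs.core.MediaType;  
 import javax.ws.rs.core.Response;  

 @Path("iot")  
 public class IotResource {  

   // POST /resources/iot/sensordata -- called by the Python script on the Pi  
   @POST  
   @Path("sensordata")  
   @Consumes(MediaType.APPLICATION_JSON)  
   public Response createSensorData(IotSensorData data) {  
     // persist the measurement in the sensor data table (JPA/JDBC code omitted)  
     return Response.ok().build();  
   }  

   // GET /resources/iot/sensordata/{id}  
   @GET  
   @Path("sensordata/{id}")  
   @Produces(MediaType.APPLICATION_JSON)  
   public IotSensorData getSensorDataById(@PathParam("id") int id) {  
     // look up a single measurement (omitted)  
     return new IotSensorData();  
   }  
 }  

 // Hypothetical payload class matching the JSON sent from the Raspberry Pi  
 class IotSensorData {  
   public String dataDate;  
   public String dataType;  
   public String dataValue;  
   public Double dataValueNumber;  
   public String sensorid;  
 }  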

The service can be tested from any REST client. You can use 'Test Web Service' from within JDeveloper, or use a tool such as Postman to POST a test message.
Calling Webservices from Python

Database is in place, webservice is up and running, so the only remaining thing is to call the service from my Python code. By now it is starting to get pretty obvious that this, too, cannot be very difficult with Python. And indeed, it is very simple again. Python comes with two libraries that can be used for this purpose: the requests library and the json library. When you use those two, it takes only a couple of minutes to implement the webservice call. The code sample below shows how to import the two libraries and then how to construct the request. The request of course needs a resource URL; in our case the relevant part is 'iot/sensordata'. We also need to tell the request that it is a JSON type request, which is part of the header. Finally we need to construct our data string that contains the measurement data. Here we can construct the exact JSON string that is expected by the service. The data that is sent in the request is simply dumped as a JSON string by calling "json.dumps()". This is the exact same string as can be seen in the Postman screenshot, with other data of course.

Note that I use sensorid = 1 because this is the unique identifier of this sensor. To be 100% dynamic we could also use the IP address of the Raspberry Pi as sensorid; we could call out to http://httpbin.org/ip to get the IP address. For now I keep it simple and stick with the hardcoded value of 1.

 # import for the ws  
 import requests  
 import json  
 # end import for the ws  

 # here we prepare and call the ws  
  url = "http://yourhost.com:7101/IoTRestJsonService/resources/iot/sensordata"  
  headers = {'Content-Type':'application/json'}  
  sensordata = {"dataDate":i.isoformat(),"dataType":"T", "dataValue":t,"dataValueNumber":t,"sensorid":"1" }  
  r=requests.post(url, data=json.dumps(sensordata),headers=headers)  
Finally I share the complete code of the Python script with you so you can see and learn how to do this.
 # grovepi_lcd_dht.py  
 #  
 # This is an project for using the Grove LCD Display and the Grove DHT Sensor from the GrovePi starter kit  
 #   
 #  In this project, the temperature and humidity from the DHT sensor are printed on the Grove LCD display  
 from grovepi import *  
 from grove_rgb_lcd import *  
 import logging  
 import datetime  
 # for the ws  
 import requests  
 import json  
 # end for the ws  
 dht_sensor_port = 7          # Connect the DHt sensor to port 7  
 # lets log to file  
 logger = logging.getLogger('weather.logger')  
 logger.setLevel('DEBUG')  
 file_log_handler = logging.FileHandler('/home/pi/Desktop/Lucs_projects/weatherstation/weather.log')  
 logger.addHandler(file_log_handler)  
 stderr_log_handler = logging.StreamHandler()  
 logger.addHandler(stderr_log_handler)  
 formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')  
 file_log_handler.setFormatter(formatter)  
 stderr_log_handler.setFormatter(formatter)  
 while True:  
      try:  
           [ temp,hum ] = dht(dht_sensor_port,0)          #Get the temperature and Humidity from the DHT sensor  
           t = '{0:.2f}'.format(temp)  
           h = '{0:.2f}'.format(hum)  
           i = datetime.datetime.now()  
           logger.info("temp ="+ t + "C\thumidity ="+ h + "%")   
           setRGB(0,128,64)  
           setRGB(0,255,0)  
           setText("Temp:" + t + "C   " + "Humidity:" + h + "%")  
         # here we prepare and call the ws  
           url = "http://yourhost.com:7101/IoTRestJsonService/resources/iot/sensordata"  
           headers = {'Content-Type':'application/json'}  
         #prep the temp message  
           sensordata = {"dataDate":i.isoformat(), "dataType":"T", "dataValue":t,"dataValueNumber":t,"sensorid":"1"}  
           r=requests.post(url, data=json.dumps(sensordata),headers=headers)  
         #prep the humidity message  
           sensordata = {"dataDate":i.isoformat(), "dataType":"H", "dataValue":h,"dataValueNumber":h,"sensorid":"1"}  
           r=requests.post(url, data=json.dumps(sensordata),headers=headers)  
         # end ws call  
            time.sleep(60)                 
      except (IOError,TypeError) as e:  
           logger.error('Error')  
            logger.error(e)  
Note that, almost at the end of the script, I use time.sleep(60). This puts the script to sleep for one minute, which means that every minute new data is measured, sent to the webservice and saved in the database. It is very simple to build an ADF page to display the data in some graphs. That is beyond the scope of this blogpost; however, here is an image of the result.

Note
The core purpose of this post is to describe how to use webservices from Python to forward sensor data to a (cloud) server. In a later post I describe how to use the better suited MQTT protocol. An example of this is CloudMQTT: managed Mosquitto servers in the cloud. Mosquitto implements the MQ Telemetry Transport protocol, MQTT, which provides lightweight methods of carrying out messaging using a publish/subscribe message queueing model.

Resources
1) RESTful Web Service in JDeveloper 12c
2) Calling REST/JSON from Python

Monday, August 03, 2015

ADF 12.1.3 : Implementing Default Table Filter Values

In one of my projects I ran into a requirement where the end user needs to be presented with default values in the table filters. This sounds like a common requirement that is easy to implement. However, it proved to be not so common, as it is not in the documentation, nor are there any blogposts to be found that talk about this feature. In this blogpost I describe how to implement it.

The Use Case Explained
Users of the application would typically enter today's date in a table filter in order to get all data that is valid for today. They do this each and every time. In order to facilitate them I want to have the table filter pre-filled with today's date (at the moment of writing July 31st 2015).


So whenever the page is displayed, it should display 'today' in the table filter and execute the query accordingly. The problem is to get the value in the filter without the user typing it. Let's first take a look at how the ADF search and filters are implemented by the framework.

Implementation of Default ADF Table Filter explained

When you drag a collection from the Data Control and drop it as a table with filtering enabled, several things happen.

The table component will have a 'filterModel' attribute (line 9) and the 'filterVisible' property (line 11) is set to true. The columns in the table have the 'filterable' property set to true (line 13).

1:          <af:table value="#{bindings.AllEmployees.collectionModel}" var="row"  
2:                   rows="#{bindings.AllEmployees.rangeSize}"  
3:                   emptyText="#{bindings.AllEmployees.viewable ? 'No data to display.' : 'Access Denied.'}"  
4:                   rowBandingInterval="0"  
5:                   selectedRowKeys="#{bindings.AllEmployees.collectionModel.selectedRow}"  
6:                   selectionListener="#{bindings.AllEmployees.collectionModel.makeCurrent}"  
7:                   rowSelection="single"  
8:                   fetchSize="#{bindings.AllEmployees.rangeSize}"  
9:                   filterModel="#{bindings.AllEmployeesQuery.queryDescriptor}"  
10:                   queryListener="#{bindings.AllEmployeesQuery.processQuery}"  
11:                   filterVisible="true" varStatus="vs" id="t2">  
12:                <af:column sortProperty="#{bindings.AllEmployees.hints.EmployeeId.name}"  
13:                      filterable="true"  
14:                      headerText="#{bindings.AllEmployees.hints.EmployeeId.label}"  
15:                      id="c5">  
16:                  <af:inputText value="#{row.bindings.EmployeeId.inputValue}"  
17:                         ....etc  

Also in the pageDefinition behind this page a 'searchRegion' executable was created. This 'searchRegion' is used by the table component for its filterModel and queryListener.
1:    <iterator Binds="AllEmployees" RangeSize="25" DataControl="HrServiceDataControl" id="AllEmployeesIterator"/>  
2:    <searchRegion Binds="AllEmployeesIterator" Criteria=""  
3:           Customizer="oracle.jbo.uicli.binding.JUSearchBindingCustomizer" id="AllEmployeesQuery"/>  

Finally note that, according to the documentation, 'In addition any column that wants to support filtering must have filterable="true" set along with the sortProperty="...". The "sortProperty" attribute is the key for the filter field in the model.'
When I run the page, filtering works out of the box; there is nothing extra that I need to do. However, if I want to have programmatic access to the filter in order to set default values, there is a lot more to do. As a matter of fact, I need to have access to the queryDescriptor.

The Implementation of the Default Filter Values
To get access to the queryDescriptor, I have to create a managed bean definition and a Java class with a method where I can look up the 'searchRegion' binding and work with it. The managed bean is defined in the taskflow config file.
1:    <managed-bean>  
2:     <managed-bean-name>EmpTable</managed-bean-name>  
3:     <managed-bean-class>com.blogspot.lucbors.tablefilter.view.beans.LucsTableBean</managed-bean-class>  
4:     <managed-bean-scope>pageFlow</managed-bean-scope>  
5:    </managed-bean>  

The Java class, called LucsTableBean, needs at least one method to implement the functionality. In this method I first need to get the queryDescriptor from the 'searchRegion' binding. I know the name of that binding, so I can easily look it up. Next I can get the 'FilterableQueryDescriptor' directly from that search binding. The method FilterableQueryDescriptor.getFilterConjunctionCriterion() returns the map of parameters involved in searching.
Unfortunately it is not enough to use this method to get the FilterConjunctionCriterion. I have to iterate over the ConjunctionCriterion and work with them one by one. I also have to check which type of ConjunctionCriterion I get from the iterator, as there are two different ones:

  • AttributeCriterion
  • ConjunctionCriterion

Only the AttributeCriterion can be set; the ConjunctionCriterion represents a group of AttributeCriterion. For this use case I need to change the value of 'HireDate'. Therefore I check if the current attribute is 'HireDate' and, if so, I get to the point where I can assign the default value. After making the necessary changes I return this customized query descriptor in order for the table to use it.
So finally the Java method for deriving the QueryDescriptor ends up like the one below, where
1) in lines 2, 3 and 4 I find the search binding and the queryDescriptor;
2) in lines 5-9 I find the List of criterion (if any);
3) in lines 11-12 I set the value of today in the AttributeCriterion, if it is the HireDate attribute;
4) because I only want to do this on initial display (it is a default search), I set an indicator (line 16) and process the query automatically (line 19) with the default search criteria.
1:    public FilterableQueryDescriptor getCustomQueryDescriptor() {  
2:      String bindingEl = "#{bindings.AllEmployeesQuery}";  
3:      FacesCtrlSearchBinding sbinding = (FacesCtrlSearchBinding) JSFUtils.resolveExpression(bindingEl);  
4:      FilterableQueryDescriptor fqd = (FilterableQueryDescriptor) sbinding.getQueryDescriptor();  
5:      if (fqd != null && fqd.getFilterConjunctionCriterion() != null && isInitialQuery()) {  
6:        ConjunctionCriterion cc = fqd.getFilterConjunctionCriterion();  
7:        List<Criterion> lc = cc.getCriterionList();  
8:        for (Criterion c : lc) {  
9:          if (c instanceof AttributeCriterion) {  
10:            AttributeCriterion ac = (AttributeCriterion) c;  
11:            if ((ac.getAttribute().getName().equalsIgnoreCase("HireDate")) && (ac.getValue() == null)) {  
12:              ac.setValue(getToday());  // we need date without time so lets call getToday()  
13:            }  
14:          }  
15:        }  
16:        setInitialQuery(false);  
17:        RichTable tbl = getTable();  
18:        QueryEvent queryEvent = new QueryEvent(tbl, fqd);  
19:        sbinding.processQuery(queryEvent);  
20:      }  
21:      return fqd;  
22:    }  
23:    public Date getToday(){  
24:      Date today = new Date();      
25:      // Get Calendar object set to the date and time of the given Date object  
26:      Calendar cal = Calendar.getInstance();  
27:      cal.setTime(today);  
28:      // Set time fields to zero  
29:      cal.set(Calendar.HOUR_OF_DAY, 0);  
30:      cal.set(Calendar.MINUTE, 0);  
31:      cal.set(Calendar.SECOND, 0);  
32:      cal.set(Calendar.MILLISECOND, 0);  
33:      // Put it back in the Date object  
34:      today = cal.getTime();  
35:      return today;  
36:    }  

NOTE: I created a utility method 'getToday' in order to get a date without a time. Otherwise, due to the time component, the query returns no results.

For the table to use this FilterableQueryDescriptor, I need to change the value of the filterModel attribute of the table component from the default:
1:  filterModel="#{bindings.AllEmployeesQuery.queryDescriptor}"  
to my custom one:
1:  filterModel="#{pageFlowScope.EmpTable.customQueryDescriptor}"  

I also changed the binding attribute of the table so I can work with the table programmatically, as I do in line 18 of the getCustomQueryDescriptor() method. This all works like a charm.

Making it Generic
It is nice that I can now set a default value for the HireDate, but both the attribute name and the search value are hardcoded. Besides that, this only works for this one single table. The solution is to make the managed bean class more generic and configure it dynamically, so it can work for any table and any default search criteria.
Let's take a look at the things I want to configure dynamically:
  • First of all, I want to use this on multiple tables, so the name of the searchBinding must be dynamic
  • Next, I want to be able to use different default filter criteria for each table, so the defaultFilterCriteria must be configurable as well
To make this work I have to implement these as properties in my 'LucsTableBean' class so I can configure them in my bean definition. So I create these two properties and generate the getters and setters for them.
1:    // The map with filterCriteria. Injected as managed property.  
2:    private Map<String, Object> defaultFilterCriteria;  
3:    // The name of the querybinding. Injected as managed property.  
4:    private String queryBindingName;  
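For completeness, a sketch of what the bean with the generated accessors looks like (only the two properties from this post are shown; everything else in the class is omitted):

 import java.util.Map;  

 public class LucsTableBean {  

   // The map with filterCriteria. Injected as managed property.  
   private Map<String, Object> defaultFilterCriteria;  
   // The name of the query binding. Injected as managed property.  
   private String queryBindingName;  

   public Map<String, Object> getDefaultFilterCriteria() {  
     return defaultFilterCriteria;  
   }  

   public void setDefaultFilterCriteria(Map<String, Object> defaultFilterCriteria) {  
     this.defaultFilterCriteria = defaultFilterCriteria;  
   }  

   public String getQueryBindingName() {  
     return queryBindingName;  
   }  

   public void setQueryBindingName(String queryBindingName) {  
     this.queryBindingName = queryBindingName;  
   }  
 }  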

Note that 'defaultFilterCriteria' is a Map with a 'String' as key and an 'Object' as value. That way I can put virtually anything in it. The trick now is to give the key the same value as the name of the attribute in the AttributeCriterion. This way I can get the filter value from my defaultFilterCriteria map by calling get on the map with the name of the attribute. I know it reads a bit difficult, but here is the code.
Where I previously used:
1:  if ((ac.getAttribute().getName().equalsIgnoreCase("HireDate"))  
I can now get the filterValue directly by using the following:
1:  // check if a default filter exists for this attribute  
2:  Object filter = defaultFilterCriteria.get(ac.getAttribute().getName());  
It will give me the Value that I want to use as default.

In the config file I can now set the values of the managed properties for queryBindingName and defaultFilterCriteria.
The value for the queryBindingName is 'AllEmployeesQuery' and for the defaultFilterCriteria I enter 'HireDate' as key, and '31/7/2015' as value. That should do the trick.
1:    <managed-bean>  
2:     <managed-bean-name>EmpTable</managed-bean-name>  
3:     <managed-bean-class>com.blogspot.lucbors.tablefilter.view.beans.LucsTableBean</managed-bean-class>  
4:     <managed-bean-scope>pageFlow</managed-bean-scope>  
5:     <managed-property>  
6:      <property-name>queryBindingName</property-name>  
7:      <value>AllEmployeesQuery</value>  
8:     </managed-property>  
9:      <managed-property>  
10:      <property-name>defaultFilterCriteria</property-name>        
11:     <map-entries>  
12:       <map-entry>  
13:        <key>HireDate</key>  
14:        <value>31/7/2015</value>  
15:       </map-entry>  
16:      </map-entries>  
17:     </managed-property>  
18:    </managed-bean>  

Now I am ready to change the method to work with those managed properties. First of all I need to look up the search binding, which previously was hardcoded. It is rather simple to do that: just change the hardcoded value into the managed property's value, and now I can get the FilterableQueryDescriptor from any search binding.
1:    public FilterableQueryDescriptor getCustomQueryDescriptor() {  
2:      String bindingEl = "#{bindings." + queryBindingName + "}";  
3:      FacesCtrlSearchBinding sbinding = (FacesCtrlSearchBinding) JSFUtils.resolveExpression(bindingEl);  
4:      FilterableQueryDescriptor fqd = (FilterableQueryDescriptor) sbinding.getQueryDescriptor();  
Now I need to figure out a way to work with the filter criteria. I know it is a map, and I know that I will only use it if it contains values. Let's continue to the part where I am looping over the List of Criterion. For each and every AttributeCriterion I want to check if there is a filterValue in the map, and if so, I want to apply that value. In the case of a Date I have to do some extra work, because I cannot simply set the String value; it needs to be parsed into a Date first. All other values (as far as I have tested) work fine.

So what is in the code below?
1) After checking if it is an AttributeCriterion, I use the 'Name' of the attribute to look up the filter value in the map with defaultFilterCriteria (line 5).
2) If the filter has a value, I use a little trick to find out whether I am working with a Date (lines 7-9).
3) If it is a Date, I need to format and parse it before I can set it as filterValue (line 13).
4) If it is not a Date, I simply use the value as is (line 19).
1:          for (Criterion c : lc) {  
2:            if (c instanceof AttributeCriterion) {  
3:              AttributeCriterion ac = (AttributeCriterion) c;  
4:              // check if a default filter exists for this attribute  
5:              Object filter = defaultFilterCriteria.get(ac.getAttribute().getName());  
6:              if (filter != null) {  
7:                // OK, this might not be optimal, but I try here to get the default component type  
8:                // if it is an input date, we need to parse the String value to a Date  
9:                if (ac.getComponentType(null) == AttributeDescriptor.ComponentType.inputDate) {  
10:                  SimpleDateFormat formatter = new SimpleDateFormat("dd/MM/yyyy");  
11:                  try {  
12:                      Date date = formatter.parse(filter.toString());  
13:                      ac.setValue(date);  
14:                      // the parsed date is set as the default filter value  
15:                  } catch (ParseException e) {  
16:                    e.printStackTrace();  
17:                  }  
18:                } else {  
19:                  ac.setValue(filter);  
20:                }  
21:              }  
22:            }  
23:          }  
Now all is ready to use any kind of default filter value, on multiple tables. All is based on a 'simple' class and can be configured in the managed bean definition.

Before you start asking 'what about tomorrow?'
I was expecting the question, so let me answer it before you can ask: 'What about tomorrow?'. The way I implemented the defaultFilterValue for HireDate works for 'today' but not for 'tomorrow':
1:      <managed-property>  
2:      <property-name>defaultFilterCriteria</property-name>  
3:      <map-entries>       
4:       <map-entry>  
5:        <key>HireDate</key>  
6:        <value>31/7/2015</value>  
7:       </map-entry>  
8:      </map-entries>  
9:     </managed-property>  
So how can I configure the value in such a way that I get the current date into the filter? It proved to be pretty simple.
The map is defined as a 'String, Object' type map, so I can actually put whatever I want in the value. So let's say that I want the HireDate's filter value to be 'TODAY'. Why not define it in exactly that way:
1:      <managed-property>  
2:      <property-name>defaultFilterCriteria</property-name>  
3:      <map-entries>       
4:       <map-entry>  
5:        <key>HireDate</key>  
6:        <value>TODAY</value>  
7:       </map-entry>  
8:      </map-entries>  
9:     </managed-property>  
Now in my class, 'defaultFilterCriteria.get(ac.getAttribute().getName())' will return TODAY for the HireDate attribute.
In that case I simply set the value of the filter to today's date (lines 4-5). I could even make it work for YESTERDAY, TOMORROW, LASTYEAR and so on....
1:                if (ac.getComponentType(null) == AttributeDescriptor.ComponentType.inputDate) {  
2:                  SimpleDateFormat formatter = new SimpleDateFormat("dd/MM/yyyy");  
3:                  try {  
4:                    if (filter.toString().equalsIgnoreCase("TODAY")) {  
5:                      ac.setValue(getToday());  
6:                    } else {  
7:                      Date date = formatter.parse(filter.toString());  
8:                      ac.setValue(date);  
9:                    }  
10:                  } catch (ParseException e) {  
11:                    e.printStackTrace();  
12:                  }  
13:                } else {  
14:                  ac.setValue(filter);  
15:                }  

A Final Example
With all logic in place, I will now show you how to work with two tables that each use their own instance of the LucsTableBean and have their own specific managed properties. There is no need to write any Java code; I simply define the properties for the two managed bean instances in the config file.
1:    <managed-bean>  
2:      <managed-bean-name>EmpTable</managed-bean-name>  
3:      <managed-bean-class>com.blogspot.lucbors.tablefilter.view.beans.LucsTableBean</managed-bean-class>  
4:      <managed-bean-scope>pageFlow</managed-bean-scope>  
5:      <managed-property>  
6:        <property-name>queryBindingName</property-name>  
7:        <value>AllEmployeesQuery</value>  
8:      </managed-property>  
9:      <managed-property>  
10:        <property-name>defaultFilterCriteria</property-name>  
11:        <map-entries>  
12:          <map-entry>  
13:            <key>HireDate</key>  
14:            <value>TODAY</value>  
15:          </map-entry>  
16:        </map-entries>  
17:      </managed-property>  
18:    </managed-bean>  
19:    <managed-bean>  
20:      <managed-bean-name>AdminTable</managed-bean-name>  
21:      <managed-bean-class>com.blogspot.lucbors.tablefilter.view.beans.LucsTableBean</managed-bean-class>  
22:      <managed-bean-scope>pageFlow</managed-bean-scope>  
23:      <managed-property>  
24:        <property-name>queryBindingName</property-name>  
25:        <value>EmployeesInAdministrationDepartmentQuery</value>  
26:      </managed-property>  
27:      <managed-property>  
28:        <property-name>defaultFilterCriteria</property-name>  
29:        <map-entries>  
30:          <map-entry>  
31:            <key>HireDate</key>  
32:            <value>7/6/1994</value>  
33:          </map-entry>  
34:          <map-entry>  
35:            <key>FirstName</key>  
36:            <value>S</value>  
37:          </map-entry>  
38:        </map-entries>  
39:      </managed-property>  
40:    </managed-bean>  

And of course make sure that the tables have the 'binding' attribute and the 'filterModel' set to the appropriate values. When running the page with the two tables, you will see that each has its own default filter values, exactly as defined.

Summary
Working with filters is very valuable for end users. In order to make it even more user-friendly, we can supply them with default filter values. This functionality unfortunately is not available out of the box, but after reading this blogpost, you now have a way to implement it. I use a 'simple' Java class with only one method. The class can be used as a managed bean, and its properties can be set in the bean configuration. This enables me to work with a map of filter values that can be configured for each individual bean instance. The difficulty lies in the ADF implementation of the FilterableQueryDescriptor and its criteria.
NOTE: The implementation of FilterableQueryDescriptor has changed from ADF 11.x to ADF 12.1.x; for instance, getFilterCriteria() has been deprecated. I did not test the solution proposed in this blogpost in 11.x, but I think I can safely assume that it only works in 12.1.x and not in 11.x.

Resources
The code for this blogpost can be found on github.

1) ADF Table Component Documentation
2) Filter reset (12.1.x)
3) Date Range Filter (12.1.x)

Thursday, July 30, 2015

IoT Hackathon Part III : Some enhancements to the sensor example

In my previous post I showed you how to set up a simple weather-station using a Raspberry Pi, GrovePi sensors and Python. It worked very well, but there is definitely room for improvement. In this short post I describe some of these improvements. First you will learn how to start the weather-station when you reboot the Pi. Next I will show you how to create some decent log info.

Autostart
The weather-station works like a charm, but if the Raspberry Pi is rebooted, you need to manually restart the Python script. That is not the way I want this to work; I want the weather-station to start on reboot. Actually, this is very simple. The Raspberry Pi uses Linux as its OS, and thus we can use the crontab to schedule when to start tasks. With the @reboot directive, a task starts on reboot. The only thing we need to do is add a line to the crontab that says we want to start our weather-station on each and every reboot. You can open the crontab for editing by issuing the following command:
 $ sudo crontab -e  
Next you simply add the following line of code to the crontab and you are good to go.
 @reboot python /home/pi/Desktop/Lucs_projects/weatherstation/weatherstation.py &  
The “&” at the end of the line means the command is run in the background, so it won’t stop the system from booting up. It is as simple as that.

Logging
In my initial setup I used a print statement to send the reading to the console.
 print "temp =", temp, "C\thumadity =", hum,"%"  
However, when the weather-station is started on reboot instead of from the command prompt, there is no console and there is no way that we can see the data. For that reason I decided to add some logging to the Python script. For this we can use the logging library. This enables you to create a logger, create log handlers to write info to file and console, and also add some formatting to your log statements. All of this is explained in the Python documentation. So first import the logging library, next create a logger and some log handlers, optionally add some formatting, and your logging is ready to use.
 import logging  
 import datetime
  
 # lets create a logger 
 logger = logging.getLogger('weather.logger')  
 logger.setLevel('DEBUG')

 # create a log handler to log to file   
 file_log_handler = logging.FileHandler('/home/pi/Desktop/Lucs_projects/weatherstation/weather.log')  
 logger.addHandler(file_log_handler)  

 # create a log handler to log to the console
 stderr_log_handler = logging.StreamHandler()  
 logger.addHandler(stderr_log_handler)  

 #now add some formatting (note the import of datetime is required)
 formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')  
 file_log_handler.setFormatter(formatter)  
 stderr_log_handler.setFormatter(formatter)  

 #Now we can write use the logger  
 logger.info("temp ="+ t + "C\thumidity ="+ h + "%")   
When the weather-station is started on reboot, the log statements can be found in 'weather.log'. When you start it from the command line, log statements are visible in both the console and the 'weather.log' file.

Resources
1) Start at reboot
2) Python Logging

Tuesday, June 30, 2015

ODTUG KScope15: One week in a nutshell

Last week I visited ODTUG KScope15 in Hollywood Florida. In this post I will share my findings and hope that you can benefit from it.
My general impression of the conference is that it was too hot for me in Florida. At least too hot for a conference. If you can sit on the beach the whole day long, it is a great place to be, but hey, I had to work.... From a content perspective, there was a whole lot of Mobile and Cloud, and just a tiny bit of ADF. We need to work on this for next year, as the attendees were really asking for more ADF content.

Anyway, here is my week in a nutshell.
Saturday
Not much to do after an 18-hour trip. So when I arrived at the hotel, the only thing to do was enjoy the view and get some Cuban food. What better place to do that than in Southern Florida. The food was good and I went there again during the week.


Sunday
The Sunday Symposium was a very good one from an Oracle ADF developer perspective. It took a deep look at the impact of the cloud platform on the role and capabilities of application developers. Oracle showed the latest and upcoming products for web and mobile application development and deployment in the cloud. There was also room for empowering business users with development capabilities through the cloud, with products such as MAX and ABCS (see later). There was a demo of MAX, aka Mobile App Accelerator, which is part of Oracle Mobile Cloud Service, where a business user can actually create real mobile apps without coding.

The final session of the day was really worthwhile. In this session Brian Fry showed Oracle Application Builder Cloud Service (it is as easy as ABC) publicly for the very first time. Oracle ABCS enables you to drag-and-drop your way to enterprise-ready JavaScript and HTML web applications.

More details on the Sunday Symposium can be found here.

Monday
On Monday I attended an IoT hands-on lab. It used an APEX front end to show results, but for me the main content (IoT) was more important. I actually built my own thing and connected it to an Oracle APEX application. I used a circuit board, wires, capacitors, a wifi device, LEDs and a sensor. I wired them together into a real THING, connected it to the internet and started communicating with an Oracle database. Because of the closed API of the REST service they used, I had to build an APEX application to control the thing and report on its data. Had it been an open API, I would have created an ADF or MAF app to do the exact same thing.

During lunch I sat down at the MAF and ADF table, discussing with my peers and also with Oracle Product Management. We also discussed the big announcements made by Oracle, of which, from my perspective, Mobile Cloud is the most interesting one. However, there are many other clouds that we need to look into. Makes you wonder: usually when I am that much in the clouds, I tend to call it foggy. I hope Oracle's cloud strategy is not foggy, but crystal clear. I know they have a plan, and I know there is a lot of work to do. We discussed the overlap we see between Mobile Cloud, Integration Cloud and also Application Builder Cloud. Not sure if Oracle will take parts of those clouds and bundle them into a new one, or actually transfer stuff from MCS and ABCS to ICS, where it belongs. We will see.

In the afternoon I had to work on a proposal for a customer, so no more sessions until 6:00 pm when the Oracle ACE reception started. It was outside near the pool, but still it was very hot, so we had to drink plenty of 'water'. After that I continued to the ADF Community night, where we used the Wii to play games.

Tuesday
This was the day of my presentation: "Real Life MAF: The things you don't learn from Oracle's Developer Guide". I promised attendees to provide links to my demos and also to my slides. So here they are:

Demos 1 and 2 are available here, whereas you can find 4 here.
  1. Loading Images in the background
  2. Action Complete (Programatically showing popup) and fragment demo
    1. Files involved in action complete:
      1. Complaints.amx
      2. fha.js
      3. FlightAppBean.java
  3. Real Sliding Springboard (is provided as Sample App with Oracle MAF)
  4. Local Notifications
The whole presentation is available at slideshare.
After my session I had to run to the Lunch and Learn panel, where I joined co-ACE(D)s Lonneke Dikmans, Debra Lilley, Mia Urman and John Flack. As you can see, the women have the majority, which I think is not necessarily a bad thing. We discussed with the audience what they expect of the next ODTUG KScope, which by the way will be in Chicago! My observation was that there was too much cloud and mobile and too little ADF, so here is something that we should think about for the next conference. Next I joined Lonneke's session called "SOA made Simple". Using a waitress as an analogy for a service bus really makes ordering breakfast a whole other experience.

My afternoon ended with a hacking session with one of the attendees of my session. It was mainly about background threads and refreshing the UI in a MAF app. The only thing you need to do is call flushDataChangeEvent() and it will refresh the UI with all changes from that background thread.
 AdfmfJavaUtilities.flushDataChangeEvent()  
I was even able to refresh the UI from two separate background threads. I will blog about that in a separate post.
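As a hedged sketch of what such a background refresh can look like (the class, the thread setup and the work itself are illustrative; only the flushDataChangeEvent() call is the point being made here):

 import oracle.adfmf.framework.api.AdfmfJavaUtilities;  

 public class BackgroundRefresh {  

   public void startBackgroundWork() {  
     Runnable work = new Runnable() {  
       public void run() {  
         // ... perform the long running work and update the bound data here ...  
         try {  
           // push the pending data changes from this background thread to the UI  
           AdfmfJavaUtilities.flushDataChangeEvent();  
         } catch (Exception e) {  
           // log and ignore in this sketch  
         }  
       }  
     };  
     new Thread(work).start();  
   }  
 }  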

My day ended with a great dinner with a customer, or better a friend, at GG's, a very nice place to have good seafood.

Wednesday
Today was actually a day where I attended several sessions. I started with some support (and coffee) for Lonneke, who was talking about Choosing the Right Mobile Architecture. One bummer was the Oracle Mobile Cloud Service hands-on lab, which in the end was not a hands-on but a demo. Unfortunately, even for Oracle Product Management there was no way to get access to MCS instances that could be used by the audience. I was really looking forward to getting my hands on MCS, but this was not possible. After lunch I attended two more sessions. One was by Joe Huang, outbound PM for Oracle MAF. He was talking about "New Core and Sync Services in the Oracle Mobile Application Framework: What Does It Mean for Developers". Nothing really new here for me. The other session of the afternoon was by Raghu Srinivasan. He is Director of Development for Cloud and Mobile Development Tools, and responsible for the Eclipse tooling (OEPE). His session was called "REST Easy with Oracle MAF: Building RESTful MAF Applications in Eclipse (OEPE)". It was a great session where I got to see the power of MAF in Eclipse. I mainly (or should I say only) use JDeveloper, but this was an eye opener. Amongst a lot of other things he showed how to use a RAML definition to generate all the Java you need to work with REST/JSON services. I am convinced now; from now on I will use OEPE more for MAF development. Great features!! A good tutorial if you are just getting started can be found here.

The day ended with the white party at Nikki Beach. Great food, great venue, but the way I look at it, it was a pretty long trip (one hour each way) for the party.


Thursday
It's a wrap. No time to attend the closing session. We had to be at the airport at 11:30. It was a great KScope again. I met some old friends and found some new ones. I missed several people who I hope to see soon. And I hope to be back next year.