BaseVISor Inference Engine - forward-chaining inference engine specialized to handle facts in the form of RDF triples with support for RuleML, R-Entailment and XML Schema Datatypes.
Creately - Online diagram and design tools. No Software Downloads, No Installs. Draw and Design right in your browser.
headup - The Firefox addon that helps you discover content related to your interests and your friends.
yoono - Manage all your social networks and IM services in one place.
I spent the last several weeks working on a prototype application I’m calling RDFBurner. The purpose of this application is to provide a user interface that allows for the rapid composition of triples. The triples will be parsed and used to compose RDF. The user interface provides four entries: Domain, Subject, Predicate and Object.
The idea is that you type a new Domain or choose an existing one from a list, then type or choose a subject, predicate and object, and then save that to create a new triple. Below the entries, you’ll see a list of previously entered triples. The intent behind the user interface is to provide instant visual feedback on your work. Seeing the triples laid out in a tabular fashion gives you many clues about what sorts of triples you would need to describe your particular domain.
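The data model behind this interface is simple enough to sketch directly. Here is a minimal Python sketch of the idea (the class and method names are hypothetical, not RDFBurner’s actual code, which is written in Visual Basic):

```python
# Minimal sketch of RDFBurner's data model: each saved entry is a
# (domain, subject, predicate, object) row, and the list below the
# entry fields is just a listing of these rows. Names are hypothetical.

class TripleStore:
    def __init__(self):
        self.rows = []  # list of (domain, subject, predicate, object)

    def save(self, domain, subject, predicate, obj):
        """Called when the user saves the four entry fields."""
        row = (domain, subject, predicate, obj)
        if row not in self.rows:        # avoid duplicate triples
            self.rows.append(row)

    def domains(self):
        """Existing domains, offered as choices in the Domain entry."""
        return sorted({d for d, _, _, _ in self.rows})

    def triples_for(self, domain):
        """Rows shown in the list for the selected domain."""
        return [(s, p, o) for d, s, p, o in self.rows if d == domain]

store = TripleStore()
store.save("books", "Moby-Dick", "hasAuthor", "Herman Melville")
store.save("books", "Moby-Dick", "publishedIn", "1851")
```

The tabular listing is then just `store.triples_for(domain)` rendered into the grid.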
During the construction of the prototype, I became frustrated with the complexity of the code. Just from developing the user interface, I could already see how the code was turning into a tangled web of gibberish. This is a fairly typical result when coding to Microsoft’s “Event Driven” programming paradigm. The idea of Events letting your program know that something has occurred, such as a key press or a record selection, is, on the surface, a good one. But in practice, if you’re not careful (and Lord knows I’m careful, but I want to see results quickly, so I’m not that careful) you end up with a big mess. Crafting OOP classes helps to some degree, but generally the paradigm leads to sloppy, difficult-to-follow code.
So, once again, I began to think about the challenge of managing the complexities of a user interface. I realized that it might be possible to de-couple the intent of the user from the features in the program, thereby adding some much-needed programming structure. If a user gestures by pressing a key or clicking a record in a grid, I could simply make a note of it and move on. Later, a timer in the program would wake up periodically and process these gestures. When an event fires, the logic would be extremely simple, as the only task is to queue up the user’s intent. Features in the program would be simplified, as they could be coded as very atomic, local-scope processes.
- When an event fires such as Control_KeyDown or Form_SelectionChange, create a Gesture object that represents the intent of the user’s action.
- Using a map, translate that Gesture object into one or more program-feature invocation objects.
- Send an invocation object to each active form/control that supports those features.
The code snippet below shows that during an event, we simply queue up a new Gesture object (that’s what LogGesture does). Presumably, some program feature is going to know what to do with the mouse and keyboard focus, so the event framework’s key press is suppressed by setting KeyCode to zero.
Private Sub Entry_KeyDown(KeyCode As Integer, Shift As Integer)
    Select Case KeyCode
        Case 9  ' Tab key
            LogGesture Me.Tag & "_Entry_Tab", CStr(Shift)
            KeyCode = 0
        Case 40 ' Arrow key down
            LogGesture Me.Tag & "_Entry_Arrow", "Down"
            KeyCode = 0
        Case 38 ' Arrow key up
            LogGesture Me.Tag & "_Entry_Arrow", "Up"
            KeyCode = 0
    End Select
End Sub
Now later, a timer event will fire and process the Gesture objects waiting in the queue. In my prototype, I have the timer set to 50 ms, which gives the application a nice snappy feel. From the code snippet below, you should be able to see that each Gesture is translated into one or more Features. Each Feature is supported by one or more forms, and each such form is notified that the Feature needs to be invoked.
Private Sub Form_Timer()
    ' Translate global Gestures into Features
    Dim pGesture As Gesture, pFeature As Feature
    Dim pFeatureQueue As FeatureQueue
    Dim pFormWithFeature As FormWithFeature
    Dim pFormsWithFeatureList As FormsWithFeatureList

    For Each pGesture In globalGestures ' Process Gestures loop
        Set pFeatureQueue = TranslateGestureIntoFeatures(pGesture)
        If Not (pFeatureQueue Is Nothing) Then
            For Each pFeature In pFeatureQueue
                Set pFormsWithFeatureList = GetFormsThatSupportFeature(pFeature)
                If Not (pFormsWithFeatureList Is Nothing) Then
                    For Each pFormWithFeature In pFormsWithFeatureList ' For every form that supports the feature
                        pFormWithFeature.Form.FeatureQueue.Add pFeature ' Queue up the feature - note this is a form-level object, unlike the GestureQueue, which is global
                        pFormsWithFeatureList.Remove pFormWithFeature ' We're done, so remove the form from the list
                    Next
                End If
            Next
        End If
    Next
    Set pFeature = Nothing

    ' Check for any incoming feature requests
    For Each pFeature In FeatureQueue ' Process incoming Features loop
        DispatchFeature pFeature ' Execute the subroutine associated with the feature
    Next
End Sub
Finally, the code above runs in what I consider the “master” form, which has its own queue of feature requests. The queues here are very simple. DispatchFeature does nothing but check the name of the Feature from the Feature object and call the subroutine with the same name.
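The whole pipeline — event handler queues a Gesture, timer translates Gestures into Features and routes them to supporting forms, each form dispatches its own queue — can be sketched in a language-neutral way. Here is a minimal Python version; every name in it is a hypothetical stand-in for the VB objects above:

```python
# Sketch of the gesture/feature pipeline described above.
# Event handlers only enqueue Gestures; a periodic tick translates
# each Gesture into Features and hands them to the forms that
# support them. All names are hypothetical.

from collections import deque

GESTURE_TO_FEATURES = {
    # gesture name         -> features it invokes
    "Subject_Entry_Tab":   ["MoveFocusToPredicate"],
    "Subject_Entry_Arrow": ["MoveGridSelection"],
}

global_gestures = deque()   # global queue, filled by event handlers

def log_gesture(name, detail=""):
    """What an event handler does: record intent and return at once."""
    global_gestures.append((name, detail))

class Form:
    """A form advertises the features it supports and dispatches them."""
    def __init__(self, features):
        self.supported = set(features)
        self.feature_queue = deque()   # form-level queue
        self.handled = []

    def dispatch(self):
        while self.feature_queue:
            feature, detail = self.feature_queue.popleft()
            self.handled.append((feature, detail))  # i.e. call the sub of the same name

def tick(forms):
    """The timer body: translate gestures, route features, dispatch."""
    while global_gestures:
        gesture, detail = global_gestures.popleft()
        for feature in GESTURE_TO_FEATURES.get(gesture, []):
            for form in forms:
                if feature in form.supported:
                    form.feature_queue.append((feature, detail))
    for form in forms:
        form.dispatch()

master = Form(["MoveFocusToPredicate", "MoveGridSelection"])
log_gesture("Subject_Entry_Tab")   # what Entry_KeyDown would do
tick([master])
```

The point of the structure is visible even in this toy version: the event side knows nothing about features, and each feature handler runs as an atomic, local-scope step inside `dispatch`.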
What about RDF?
Because I am building an RDF composition utility, it makes sense to encode the gesture to program feature map in RDF and use SPARQL to retrieve (parts of) the map at runtime. I’ve not done this yet in my current prototype. The purpose of the RDFBurner prototype is to compose triples that will be converted into RDF - a task which I have yet to program. The gesture to program feature mapping is currently implemented as an SQL Query against my rather simple triple store. Once I start actually generating RDF with the utility, I’ll refactor the translator portion to use RDF and SPARQL.
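To make the idea concrete, here is a sketch of what the gesture-to-feature map could look like once it lives in a triple store rather than SQL rows. The `rb:` vocabulary and all names are hypothetical, and the Python lookup is just a naive stand-in for the SPARQL query shown in the comment:

```python
# Sketch of the gesture-to-feature map stored as triples. The rb:
# vocabulary is hypothetical. The lookup below is a naive stand-in
# for a SPARQL query such as:
#
#   SELECT ?feature
#   WHERE { ?m rb:gesture "Subject_Entry_Tab" ;
#              rb:invokes ?feature . }

MAP_TRIPLES = [
    ("map1", "rb:gesture", "Subject_Entry_Tab"),
    ("map1", "rb:invokes", "MoveFocusToPredicate"),
    ("map2", "rb:gesture", "Subject_Entry_Arrow"),
    ("map2", "rb:invokes", "MoveGridSelection"),
]

def features_for(gesture, triples=MAP_TRIPLES):
    """Join the two triple patterns on their shared subject ?m."""
    mappings = {s for s, p, o in triples
                if p == "rb:gesture" and o == gesture}
    return [o for s, p, o in triples
            if p == "rb:invokes" and s in mappings]
```

Refactoring the translator then amounts to swapping `features_for` for a real SPARQL call against the generated RDF.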
Inserting an RDF map into the Event logic of a typical application has some interesting implications:
- The resulting RDF could possibly be combined with the mapping RDF of other applications, implying the composition of a type of mash-up that works at the user-interaction level.
- Events are not the same as Gestures. For instance, when a user presses the up-arrow key while the focus is on the top row of a grid, the Event might be evntGrid_KeyPressed(UpArrow), while the gesture or intent of the end user is to move the focus to some other control above the grid, i.e. gstrGrid_Leave(Up). An Event is a hardware/form/control kind of thing. A Gesture may be triggered by an Event but represents what the user really wants to do.
- It is highly likely that there should also be a translation mapping in RDF between Events and Gestures.
- Once the mapping is in RDF, it may be possible to infer patterns and sets of patterns, especially if the RDF is coaxed into OWL. We could use such inferred knowledge to generate at minimum a skeleton of the programming logic. Obviously other parts of a given application would make use of RDF knowledgebases and the combination of the classes involved could lead to an Ontology Driven Design.
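The Event-versus-Gesture distinction in the second point above can be made concrete with a small translation table: the same hardware event maps to different gestures depending on context. A Python sketch, with all names hypothetical:

```python
# Sketch of an Event-to-Gesture translation table. The same hardware
# event maps to different user intents depending on context (here,
# which grid row holds the focus). All names are hypothetical.

EVENT_TO_GESTURE = {
    # (event, context)                      -> gesture (user intent)
    ("Grid_KeyPressed_UpArrow", "top_row"): "Grid_Leave_Up",
    ("Grid_KeyPressed_UpArrow", "mid_row"): "Grid_MoveSelection_Up",
}

def gesture_for(event, context):
    """Translate a raw Event plus its context into a Gesture, if any."""
    return EVENT_TO_GESTURE.get((event, context))
```

Like the gesture-to-feature map, this table is exactly the kind of thing that could be expressed as triples and queried at runtime.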
Out of the blue, the Artificial Intelligence group started having a discussion about the possibility of getting a meeting together. In order to do this, we’ll first need to send out a notice to the group, but none of us has that capability, so I have IMed Ichiro Tokugawa, Betwixt Epsilon, and Corro Moseley and requested that they contact me so we can send out a notice. All three were offline. Hopefully I’ll hear back from someone soon. I also heard from Kore Jardberg that Thothica is OK with our hosting the meeting on their land. Here’s the transcript from the chat:
[14:08] Kore Jardberg: Hi, is there a meeting organised sometimes? I didn’t see any for now, but I’m new here. Thanks in advance.
[14:09] Alexis Lange: I haven’t seen one for some time
[14:10] spood Udimo: I’d be interested in hosting one
[14:10] Alexis Lange: Ok, sounds good to me
[14:11] Kore Jardberg: You own a place?
[14:11] spood Udimo: but most meetings Ive been to in sl are small turnout
[14:11] spood Udimo: yes. a small place on the main land
[14:11] Alexis Lange: Yes, i do
[14:11] spood Udimo: but my shop is rather small. if we got lots of people, id be in trouble I think
[14:11] Xen Akula: Even a simple chat meeting would be nice. Lots of groups do that to make it easier on peeps.
[14:12] Alexis Lange: Yes, the OSS limit is 10 i think
[14:12] Kore Jardberg: Maybe be Thothica or Second Philosophy would agree to host the meeting
[14:13] Alexis Lange: thats an idea
[14:13] Kore Jardberg: It usually brings how many people?
[14:13] Kore Jardberg: Approx.
[14:13] spood Udimo: Never been, Kore. but me and a couple of my buddies are interested in chat bots. is there any interest in this area? “Some exciting stuff out there
[14:15] Alexis Lange: could you explain “chat bots ” ?
[14:15] Elbereth Witte: I haven’t seen many intellgient chatbots
[14:15] spood Udimo: pandorabots.com
[14:15] Kore Jardberg: Do you guys have enough knowledge to make a small presentation to start the meeting? Like showing what you’ve done?
[14:15] spood Udimo: hoo boy. maybe in a couple weeks. I’d have to check with my buddy
[14:16] spood Udimo: hes’ much further along than i
[14:16] Alexis Lange: personally, i haven’t been active enough to say yes
[14:17] Alexis Lange: but if the group were more active, i probably would get more involved
[14:18] spood Udimo: ditto
[14:19] spood Udimo: .
[14:19] Kore Jardberg: Ok, personally I’d like to present new research and findings in implementing artificial consciousness, and to discuss around this topic if some people are interested, but not for now, in two monthes
[14:20] Elbereth Witte: worst case secaniro, I know a sandbox we can storm
[14:21] Kore Jardberg: Hmmm… SL sent me an error message, I don’t know if this message was correctly sent so I repeat it: Ok, personally I’d like to present new research and findings in implementing artificial consciousness, and to discuss around this topic if some people are interested, but not for now, in two monthes
[14:21] spood Udimo: I think we’d need to get in touch with the group lead first or an officer. can’t post notices
[14:21] Elbereth Witte: SL lies about nondelivery
[14:21] spood Udimo: We’d have to get in touch with someone who can post notices. I can’t
[14:22] Kore Jardberg: Good idea
[14:22] Kore Jardberg: Do you know them, Spood?
[14:23] Kore Jardberg: Elbereth: ok, didn’t know it
[14:24] spood Udimo: owner is Ichiro Tokugawa but I dont now
[14:25] Elbereth Witte: Ichiro Tokugawa, Betwixt Epsilon, and Corro Moseley are the people who can do taht
[14:25] spood Udimo: but it sounds like we could scratch together some sort of plan. I think it might be like a month out before we have anything to present
[14:25] spood Udimo: k
[14:27] spood Udimo: I sent an IM to Ichiro, we’ll see
[14:28] spood Udimo: I’ll post at http://slholmes.org/ in the mean time as things progress
[14:28] Elbereth Witte: corro is the most recently online one
[14:29] Kore Jardberg: Ok
I installed an “artificial intelligence” bot at my work shop in Second Life.
The bot is made up of three parts:
- A hosted account on pandorabots.com (Thanks Ennui!)
- A body form sculpture made out of Second Life prims (Thanks Jan!)
- A wonderful script by Angela Talamasca that communicates with my account on pandorabots.com.
Use the pandorabots.com chat window to communicate with the bot without having to go into Second Life.
I also downloaded the source code for Knowee. Knowee is a distributed address book implemented in PHP and MySQL. Knowee looks like it provides a SPARQL endpoint which I’m sure I can make good use of. I’m planning on using that to experiment with RDF at the workshop. I hope to get lots of semantic web stuff hooked up to the bot and other features of the workshop.
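Talking to a SPARQL endpoint like the one Knowee exposes is just HTTP: the SPARQL Protocol sends the query as a `query` parameter. A minimal Python sketch, where the endpoint URL is a placeholder (Knowee’s actual endpoint path will differ) and the FOAF query is only an example:

```python
# Sketch of querying a SPARQL endpoint over HTTP. The SPARQL Protocol
# sends the query text as a 'query' parameter; the endpoint URL below
# is a hypothetical placeholder, not Knowee's real path.

from urllib.parse import urlencode
# from urllib.request import urlopen   # uncomment to actually send it

ENDPOINT = "http://example.org/knowee/sparql"   # placeholder URL

def sparql_request_url(query):
    """Build the GET request URL for a SPARQL Protocol endpoint."""
    return ENDPOINT + "?" + urlencode({"query": query})

query = """
SELECT ?name ?mbox WHERE {
  ?person <http://xmlns.com/foaf/0.1/name> ?name ;
          <http://xmlns.com/foaf/0.1/mbox> ?mbox .
}
"""
url = sparql_request_url(query)
# urlopen(url).read() would return the result set (typically SPARQL
# results XML or JSON, depending on the endpoint and Accept header)
```

Anything in the workshop — the bot included — could fetch data this way.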
In some ways, everything we have so far looks rather primitive, but with some time and energy, the workshop’s ability to inform you and realistically interact with you will be astounding.