All Things Techie With Huge, Unstructured, Intuitive Leaps

Event Logs, Process Mining and Artificial Intelligence

In my course on process mining from the Eindhoven University of Technology in the Netherlands, a person on the course forum asked where to get event logs for process mining.  This was the question posted:

Anyhow, as I was watching the lecture on Guidelines for Event Logging, I was struck by the question that usually occurs to me in such courses: But how to do it in practice?

I'm assuming that logging for the Internet of Things is part of the Things that Make Up that Internet. But otherwise? I absolutely abhor having to program, never struck me as that interesting. So how is it done in practice? Do you guys have preset functions/libraries? In case a human needs to log their behaviour, how do you ensure compliance - that they don't forget etc. etc.?

I'd love to hear more on that!

I took it upon myself to reply, and this is what I said:

I am a technical architect (and Chief Technology Officer!) for an eCommerce platform that deals with high-dollar-value goods marketed in exclusive circles. We have had the benefit of creating the technology ourselves, so we created event logs for everything.  Here are some examples:

1) When you log in, we record the time, the username, the IP address where the login came from, and whether the user was using a desktop computer or a mobile platform.

2) When you check your messages on our system, they are marked as read with a timestamp. That creates another event log.

3) When you go to view offerings, what you look at is recorded, so we can gather data on what the user likes to buy.

4) If the user is a seller, we record what he uploads into a database table, and every entry has a timestamp column to detect when the data was added.

5) Each sale is recorded along with a timestamp.

6) Each log out is recorded, along with a timestamp.

All of this is very easy to do, because when we create a database table to store data, we construct it such that each entry (called a row) has a timestamp column where the computer puts the NOW() date/time when the data is recorded.  This automagically creates the event logs for us, for all tasks, even tasks that are considered non-process related (which really add value to our processes).
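The timestamp-column pattern described above can be sketched in a few lines. This is a minimal illustration using SQLite; the table and column names are hypothetical, and a production system would use a full database server, but the idea is the same: the database fills in the timestamp by default, so every insert becomes an event log entry for free.

```python
import sqlite3

# In-memory database for illustration; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE login_events (
        id       INTEGER PRIMARY KEY,
        username TEXT NOT NULL,
        ip       TEXT,
        device   TEXT,    -- e.g. 'desktop' or 'mobile'
        ts       TIMESTAMP DEFAULT CURRENT_TIMESTAMP  -- the NOW() column
    )
""")

# Inserting a row records the event; the timestamp fills itself in.
conn.execute(
    "INSERT INTO login_events (username, ip, device) VALUES (?, ?, ?)",
    ("alice", "203.0.113.7", "desktop"),
)
conn.commit()

for row in conn.execute("SELECT username, device, ts FROM login_events"):
    print(row)
```

Because the default does the work, application code never has to remember to log the time, which is exactly why compliance is not an issue on the database side.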

When you are online, every time data is recorded, a timestamp goes along with it. We have event logs for everything.

But you can also have paper-based event logs that can be transcribed.  For example, we looked at an auto repair shop and did some very rudimentary process mining when we were constructing a business app for the company. They took appointments and recorded the time of the call and the time the customer was going to bring the car into the shop. Then the service writer recorded the details on an invoice/service sheet when the customer arrived, and pushed the invoice into a time clock, stamping the arrival time. We then looked at the mechanic's time sheet to see what hours he billed for the job. Then we knew when the customer picked it up, because the payment invoice was timestamped (usually with the cash register receipt).  They had a complete event log on various bits of paper floating around the business, and once the business was computerized, they could determine the bottlenecks (which turned out to be waiting for ordered parts).  This was my first experience of event logs in a non-computerized fashion.  Since then, I timestamp every database table that I construct -- even the ones which store metadata and mined data such as standard deviation of an aggregate of characteristics of our top buyers and sellers.
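Once those paper timestamps are transcribed, finding the bottleneck is just a matter of comparing the gaps between consecutive events. Here is a toy sketch with invented data for one repair job; the event names and times are made up, but the arithmetic is the whole technique:

```python
from datetime import datetime

# Hypothetical transcription of one repair job's paper trail, in time order.
fmt = "%Y-%m-%d %H:%M"
events = [
    ("car dropped off",  "2024-05-06 08:15"),
    ("diagnosis done",   "2024-05-06 09:00"),
    ("parts ordered",    "2024-05-06 09:10"),
    ("parts arrived",    "2024-05-07 14:30"),
    ("repair completed", "2024-05-07 16:45"),
    ("car picked up",    "2024-05-07 17:30"),
]

# Duration of each step = gap between consecutive timestamps.
durations = []
for (prev_name, prev_ts), (name, ts) in zip(events, events[1:]):
    delta = datetime.strptime(ts, fmt) - datetime.strptime(prev_ts, fmt)
    durations.append((f"{prev_name} -> {name}", delta))

# The longest gap is the bottleneck -- here, waiting for parts.
bottleneck = max(durations, key=lambda d: d[1])
print(bottleneck[0])
```

With real data you would aggregate these gaps over many jobs, but even a single trace like this makes the parts wait jump out.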

There is now a burgeoning field of NoSQL graph databases (we like neo4j) which can map semantic and/or fuzzy relationships very easily, and this course has taught me to timestamp nodes and edges to monitor significant but transient process relationships in the business milieu.
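The idea of timestamped edges can be sketched without a running graph database. Below is a minimal in-memory version with invented names and data; in neo4j you would store the timestamp as an edge property (in Cypher, something like `CREATE (b)-[:VIEWED {ts: datetime()}]->(l)`) and query it similarly:

```python
from datetime import datetime

# Minimal in-memory sketch of timestamped edges; names and data are hypothetical.
edges = []

def add_edge(src, rel, dst, ts):
    edges.append({"src": src, "rel": rel, "dst": dst, "ts": ts})

def edges_between(start, end):
    """Return relationships recorded in the window [start, end)."""
    return [e for e in edges if start <= e["ts"] < end]

add_edge("buyer:17", "VIEWED", "listing:88", datetime(2024, 5, 1, 10, 0))
add_edge("buyer:17", "BOUGHT", "listing:88", datetime(2024, 5, 3, 9, 30))

# Which relationships were active in a given window?
window = edges_between(datetime(2024, 5, 2), datetime(2024, 5, 4))
print([e["rel"] for e in window])
```

Timestamping the edges is what lets you replay how a relationship graph evolved, which is exactly the "transient process relationships" point above.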

Hope this helps,

Most people do not realize how many internet tracks they leave that are event logs when they do ordinary things online. The Germans have a word for this. It is not a nice word. It translates to "digital slime".

Event logs are ubiquitous, and it is my contention that all of these digital tracks and Big Data will lead to a plethora of training data for artificial intelligence. Artificial neural nets need concrete examples to learn from, iterating and re-iterating to "get smart."  Process mining will be huge in that respect. Process mining is the first step for computers to learn human behavior. Neural net machines and multilayer perceptrons will pore over process maps gleaned from human behavior and learn to mimic and reproduce expert behavior in a more repeatable fashion than humans can.
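Before any learning can happen, the raw event log has to be turned into a process map. The most basic such map is a directly-follows graph: how often activity B immediately follows activity A within a case. A toy sketch, with invented log data:

```python
from collections import Counter

# A toy event log: (case_id, activity), already time-ordered within each case.
log = [
    ("c1", "login"), ("c1", "browse"), ("c1", "buy"), ("c1", "logout"),
    ("c2", "login"), ("c2", "browse"), ("c2", "logout"),
    ("c3", "login"), ("c3", "browse"), ("c3", "buy"), ("c3", "logout"),
]

# Count directly-follows pairs within each case -- the simplest process map.
dfg = Counter()
last_activity = {}
for case, act in log:
    if case in last_activity:
        dfg[(last_activity[case], act)] += 1
    last_activity[case] = act

for (a, b), n in sorted(dfg.items()):
    print(f"{a} -> {b}: {n}")
```

Real process-mining tools go much further (filtering noise, discovering concurrency), but structures like this counted graph are the kind of distilled behavioral data a learning system would actually consume.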
