What types of data structures are routinely used for in-memory, real-time transaction scoring? I've used circular doubly linked lists to store (say) the 20 most recent transactions, with timestamp and other attributes, per merchant and per customer.
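For concreteness, here is a minimal Python sketch of the kind of buffer I have in mind. A `deque` with `maxlen` stands in for the fixed-size circular doubly linked list (it discards the oldest entry on overflow); all names and field choices are illustrative, not a real scoring system.

```python
from collections import defaultdict, deque

BUFFER_SIZE = 20  # keep the 20 most recent transactions per key

# one fixed-size buffer per merchant (and, analogously, per customer);
# deque(maxlen=...) drops the oldest entry automatically, mimicking a
# constant-size circular doubly linked list
recent_txns = defaultdict(lambda: deque(maxlen=BUFFER_SIZE))

def record_txn(merchant_id, timestamp, amount, **attrs):
    """Append a transaction; once full, the oldest one falls off the front."""
    recent_txns[merchant_id].append({"ts": timestamp, "amount": amount, **attrs})
```

The same structure can be keyed by customer ID instead of merchant ID; appends and evictions are O(1) either way.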
What kinds of metrics work well in this context? Among many others, I've used time since the last transaction, or time elapsed back to the 5th previous transaction.
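These velocity metrics read directly off the recent-transaction buffer. A hedged sketch (the buffer is assumed to be ordered oldest to newest, with a `"ts"` field as above; function names are mine):

```python
def time_since_last(buffer, now):
    """Elapsed time since the most recent transaction (None if no history)."""
    return now - buffer[-1]["ts"] if buffer else None

def time_to_kth_previous(buffer, now, k=5):
    """Elapsed time back to the k-th previous transaction.
    A short gap for k=5 means 5 transactions in a small window,
    i.e. unusually high velocity."""
    if len(buffer) < k:
        return None
    return now - buffer[-k]["ts"]
```

Both are O(1) lookups against the in-memory buffer, which is what makes them usable at scoring time.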
Do you rely on many small lookup tables that fit in memory to store historical data, such as merchant summary statistics broken down by day for the last 3 months (one entry per merchant per day)?
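By "lookup table" I mean something as simple as a hash map keyed by (merchant, day). A sketch, with entirely made-up field names and sample values for illustration:

```python
from datetime import date

# in-memory lookup table: one entry per merchant per day, kept ~3 months deep;
# keys, field names, and the sample row are illustrative only
daily_stats = {
    ("merchant_42", date(2024, 1, 15)): {"txn_count": 310, "total_amount": 12804.50},
}

def avg_ticket(merchant_id, day):
    """Average transaction amount for a merchant on a given day,
    or None if that (merchant, day) key is not in the table."""
    row = daily_stats.get((merchant_id, day))
    if not row or row["txn_count"] == 0:
        return None
    return row["total_amount"] / row["txn_count"]
```

At ~90 entries per merchant, even a large merchant base stays small enough to hold entirely in RAM and refresh from batch jobs overnight.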
How do you optimize server performance? For instance, at 2am, when the volume of transactions is 5 times lower than at peak time, do you use the analytic servers for other tasks, such as end-of-day re-scoring?
At peak times (severe spikes), do you switch to a simplified model that requires less memory when you lack bandwidth?
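The switching logic itself could be as simple as a throughput-based router. A sketch under stated assumptions: the two scoring functions are placeholders, and the transactions-per-second threshold is an illustrative number, not a figure from the question.

```python
def full_score(txn):
    # placeholder for the full model (many features, larger memory footprint)
    return "full"

def simple_score(txn):
    # placeholder for a lightweight fallback model (few features, small footprint)
    return "simple"

def score(txn, current_tps, peak_tps_threshold=5000):
    """Route to the lighter model under severe peaks; the threshold
    is an illustrative assumption."""
    if current_tps > peak_tps_threshold:
        return simple_score(txn)
    return full_score(txn)
```

In practice the threshold would be tuned against measured server capacity, and flagged transactions could be queued for re-scoring with the full model off-peak.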
Has anybody used the Hadoop environment to feed a true real-time processing system (that is, one with minimal latency), such as credit card processing?
For data science ROI to be positive, should advanced analytics / data science costs (people, extra hardware, and software) represent less than 10% of the cost of the general computing architecture (servers, engineers, basic data processing, and reporting)? Is there a magic number and, if it is not 10%, what would it be?