On My Watch

Wednesday, June 01, 2011

4.5

4.5 out of 10.

That's how firms rate their use of analytics, according to Michael Hopkins, editor-in-chief of MIT Sloan Management Review, as described here. Moreover, Brad Peterson, Schwab's CIO, says to get marketing, rather than IT, to pay. He used a mobile app in his example, thinking, I believe, about capturing mobile data for analysis. But why not use marketing budgets to drive marketing analytics more broadly? Better yet, align costs to revenue and profits, enable ROI calculations, and then maximize ROI.

Here are a few places to start:
  • Move beyond web and email analytics. While hits, visitor counts, email opens and the like are critical, they are not enough. Marry this data with target company organizational structures, social graphs, CRM data and other internal and external data to get a full picture of the market or slice into territories.
  • Let people drive the analytics based on their roles and needs. Sales reps can mine the data to look for past deal patterns and possible triggering events in search of the next deal. Marketing can analyze previous campaigns and plot the next one. Managers, directors and C-level executives can watch for longitudinal trends.
  • View marketing data through different lenses. For example, sales is interested in this quarter, marketing in the next, and business development in adjacent markets two to three quarters out.
In short, capture whatever data you can from the market and from the field, and let your servers and analytics algorithms crunch the data for you. And use the analytics in a feedback loop: learn from how the system is used to determine results, and let the measurable actions of those responsible for quota or P&L drive the ROI calculations.
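
As a rough illustration of the ROI roll-up I have in mind, here is a minimal sketch; the campaign records, field names and numbers are made up for the example:

# Minimal sketch: roll campaign spend and attributed revenue up into ROI.
# The campaign records and field names below are hypothetical examples.
campaigns = [
    {"name": "webinar-q2", "cost": 20000, "attributed_revenue": 65000},
    {"name": "email-nurture", "cost": 5000, "attributed_revenue": 9000},
    {"name": "trade-show", "cost": 40000, "attributed_revenue": 38000},
]

def roi(campaign):
    """Return ROI as (revenue - cost) / cost."""
    return (campaign["attributed_revenue"] - campaign["cost"]) / campaign["cost"]

# Rank campaigns so marketing budget can flow toward what actually pays.
for c in sorted(campaigns, key=roi, reverse=True):
    print(f"{c['name']}: ROI = {roi(c):.0%}")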

And turn 4.5 into 9's and 10's.


Sunday, May 01, 2011

Real Time Web and Marketing Automation

I am reposting a graphic from an earlier post but applying it to marketing automation. Although quite generic and almost ancient history (2008), I think it still pertains surprisingly well. After all, the Real Time Web is people-to-people, and so is B2B marketing.



Traditional marketing automation concerns itself primarily with managing campaigns (landing pages, etc.) and with scoring, routing and nurturing marketing leads or, more generically, contacts. Making it real time means:
  • 1:1 marketing - letting customers find and communicate with marketers while actively engaged, and vice versa.
  • Meaningful bidirectional notifications - keeping customers informed based on their interests and marketers informed of customer actions and, critically, relationships.
  • Viewing the marketing and sales process as a series of events - behavioral events, conversion events, inquiries, transactions.
  • Searching lead and other internal marketing data for contact, lead, prospect and customer actions - or those of people in their business network - on demand and continually. So while customers are googling you (or, more usually, solutions to their problems or deals), marketers can be looking for them.
  • Correlating events and data across the prospect's network to uncover hidden triggering events (a rough sketch follows below).
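
A minimal sketch of what I mean by that last point, assuming a hypothetical event feed keyed by contact and company; the event types and the grouping rule are illustrative only:

from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical event feed: (timestamp, company, contact, event_type).
events = [
    (datetime(2011, 4, 20), "Acme", "j.doe", "whitepaper_download"),
    (datetime(2011, 4, 21), "Acme", "s.lee", "pricing_page_visit"),
    (datetime(2011, 4, 22), "Acme", "j.doe", "demo_request"),
    (datetime(2011, 4, 5), "Globex", "a.kim", "newsletter_open"),
]

def triggering_companies(events, window=timedelta(days=7), threshold=3):
    """Flag companies where several actions cluster in a short window -
    a crude stand-in for 'hidden triggering events' across a network."""
    by_company = defaultdict(list)
    for ts, company, contact, kind in events:
        by_company[company].append(ts)
    flagged = []
    for company, stamps in by_company.items():
        stamps.sort()
        if len(stamps) >= threshold and stamps[-1] - stamps[0] <= window:
            flagged.append(company)
    return flagged

print(triggering_companies(events))  # ['Acme']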



Friday, March 19, 2010

Chorus, Atoms and Chairman

Is this the new nomenclature for cloud computing, specifically databases for the cloud? No more records, transaction managers and the like. In one of the more interesting sessions I have seen, yesterday's presentation at MIT by Jim Starkey of NimbusDB described his vision of a cloud database.


In NimbusDB speak, a chorus is (I think) a set of peer-to-peer database nodes all exchanging metadata messages about their data. The database is built by adding the network, alongside memory, disk and the rest, to the pyramid of data access resources, linking a set of nodes across the network.

Atoms are 50K chunks of data (why not 64K?) holding both metadata and application data.

The Chairman is responsible for managing read and write permissions to the atoms.
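
To keep the vocabulary straight in my own head, here is a toy model of the terms as I understood them - purely my own sketch, not NimbusDB's actual design or API:

# Toy model of the chorus / atom / chairman vocabulary as I understood the talk.
# This is my own illustration, not the product's actual design or API.

class Atom:
    """A ~50K chunk holding either metadata or application data."""
    def __init__(self, atom_id, payload):
        self.atom_id = atom_id
        self.payload = payload

class Chairman:
    """Arbitrates read/write permissions on atoms for nodes in the chorus."""
    def __init__(self):
        self.writers = {}  # atom_id -> node currently allowed to write

    def request_write(self, node, atom_id):
        holder = self.writers.get(atom_id)
        if holder is None or holder is node:
            self.writers[atom_id] = node
            return True
        return False  # another node holds the write permission

class Node:
    """One peer in the chorus; exchanges metadata messages with its peers."""
    def __init__(self, name, chairman):
        self.name = name
        self.chairman = chairman
        self.atoms = {}

    def write(self, atom):
        if self.chairman.request_write(self, atom.atom_id):
            self.atoms[atom.atom_id] = atom
            return True
        return False

chairman = Chairman()
chorus = [Node("n1", chairman), Node("n2", chairman)]
print(chorus[0].write(Atom(1, b"hello")))  # True
print(chorus[1].write(Atom(1, b"bye")))    # False - n1 holds the write permission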

I definitely buy the theory and, having recently finished a book on Einstein, appreciate the references to Relativity. Not all database transactions (e.g. some reads) require immediate consistency; eventual consistency is fine, as long as "eventual" falls within application-specific tolerance windows. But I also have questions - specifically about time stamps, the impact of version unavailability, whether chairmen have live synchronized backups, and the performance impact of message chatter.


Tuesday, December 01, 2009

Cloud DBs

SQL-NoSQL, MVCC-Distributed MVCC, Relational-Non relational .....
The challenges are myriad for managing data in the cloud, particularly since traditional databases (Oracle, SQL Server, MySQL, etc.) are difficult and expensive to scale and virtualize, requiring full backups for "instant" virtualization.

On the other hand, the choices are expanding. NoSQL, such as Google's BigTable and Amazon's SimpleDB, is gaining real adherents as it piles up successes (just don't ask for joins). Joins in a massively distributed environment will probably require a distributed relational database based on MVCC or some other technique to ensure queries run against a consistent snapshot. While not yet proven (AFAIK), this holds real promise. Many of the applications I am working on currently, including Innerpass' collaboration application, do require relational queries, but they don't require subsecond response. And they could also benefit from the ability to handle spikes without provisioning db servers. Eventual consistency is actually a good match with asynchronous Ajax-based page refreshes, where temporarily inaccurate or incomplete link lists can be tolerated.
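
As a sketch of what "tolerable staleness" means in practice, imagine the Ajax refresh reading from whichever replica is freshest and simply labelling how stale the data might be. The replica interface and lag figures here are hypothetical:

import time

# Hypothetical replicas, each with some replication lag and a link list.
class Replica:
    def __init__(self, name, lag_seconds, links):
        self.name = name
        self.lag_seconds = lag_seconds
        self.links = links

    def read_links(self):
        return self.links, time.time() - self.lag_seconds

def refresh_link_list(replicas, tolerance_seconds=30):
    """Return links from the freshest replica; staleness is tolerated as long
    as it falls inside the application's tolerance window."""
    best = min(replicas, key=lambda r: r.lag_seconds)
    links, as_of = best.read_links()
    stale = best.lag_seconds > tolerance_seconds
    return {"links": links, "as_of": as_of, "possibly_stale": stale}

replicas = [Replica("us-east", 2, ["doc-1", "doc-2"]),
            Replica("us-west", 45, ["doc-1"])]
print(refresh_link_list(replicas))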


My guess is it will come down, like most (if not all) technologies and architectures before it, to application requirements and business imperatives. First off, many apps won't need to change and won't change - if it ain't broke, don't fix it. Greenfield applications and, in some instances, applications that need to modernize - as they are exposed to larger user bases, for example - will need to consider cloud architectures. Technology adoption and adoption rates will be driven by the nature of the application, cost and time-to-market concerns. High-volume, low-data-value sites will tend, from what I can see, toward non-relational deployments; data warehouses toward columnar and MapReduce hybrids; and transactional applications will remain dependent on relational databases. The biggest questions are the in-between ones. And the type of relational database will be very performance- and latency-dependent. Those with high latency tolerance may find distributed databases acceptable and even preferable, while those demanding higher throughput and consistency will stick with the tried and true.


Monday, August 03, 2009

Boiling It Down

Buckets of ink and electrons have been spilled answering "What is Cloud Computing?" - ranging from the overly stringent (requiring a specific hardware and/or software formalism) to the overly broad (calling everything on the Internet a cloud application). Here is what it boils down to for me:
  • Elasticity: The resources - hardware, software, network - that the application needs to run are elastic. As application usage goes up, the application should be able to access more of what it needs, from CPU cycles and horsepower to database "rows" and disk space. And as usage goes down, these resources should be released for other applications to use. In other words, the elasticity of a single computer (where CPU, memory and other resources are shared across all the applications on that box) is spread across the data center, across clients and servers.
  • Pay for what you use: Implicit in elasticity - and perhaps its primary benefit - is the economics: aligning costs with use. As use (and resources) go up and down, so should the costs.
  • Easy to Provision/Order/Pay For - This is about on-ramping and adding resources. In a completely auto-elastic system, provisioning is implicit. But even if provisioning is not automatic, it still needs to be easy.
So .... cloud computing is both a business model (yes, whether public or private) and a technology, and these attributes generally cut across both. Elastic resources really encompass both the technology that enables elasticity and the ability to track what is used, which in turn supports the business model, whatever its specifics.
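
To make "pay for what you use" concrete, here is a back-of-the-envelope sketch; the hourly rates and usage figures are invented purely for illustration:

# Back-of-the-envelope pay-per-use cost: charges follow usage up and down.
# Rates and usage figures below are invented for illustration only.
RATE_PER_INSTANCE_HOUR = 0.10   # dollars
RATE_PER_GB_MONTH = 0.15        # dollars

def monthly_cost(instance_hours_by_day, storage_gb):
    compute = sum(instance_hours_by_day) * RATE_PER_INSTANCE_HOUR
    storage = storage_gb * RATE_PER_GB_MONTH
    return compute + storage

# A spiky month: mostly 2 instances, plus a 3-day spike at 20 instances.
usage = [2 * 24] * 27 + [20 * 24] * 3
print(f"${monthly_cost(usage, storage_gb=50):.2f}")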


Friday, June 19, 2009

My (Personal) Activity Monitor?

Like waiting for my jetpack (still waiting...), I am also waiting for an activity monitor - like a Business Activity Monitor (BAM) but for me - a Personal Activity Monitor (PAM). After years of promises and a slew of technologies - both corporate (data warehouses, real-time XML, ...) and consumer (IM, RSS readers, ...) - I know everything is possible and many things have been done. But I am just looking for an easy way to track potentially complex things about me and my accounts - not just online social networking and communication accounts (Facebook, Twitter, LinkedIn, GMail, Skype, ...) but transactional accounts (e.g. bank accounts). It is my information (so I should know it), but I don't always - and, more importantly, I want to monitor transactions and transaction status and be alerted if something goes wrong or is just not right.

I have wanted this for some time, but recently there was fraudulent activity on my personal bank account. So now I really want it. I only found out because I happened to log onto the account for another reason - and I saw suspicious activity that I should have been notified of. (Essentially, several unauthorized withdrawals over a two-day period on the opposite - West - coast, each 4-5 times bigger than either my wife or I usually takes out.) And if it didn't trigger the bank's algorithms, I should have been able to set up my own. Now, I did get the money back (but it cost me several hours, some embarrassment over a bounced check on a closed account, and I am still chasing the fee reimbursements), but it sure would have been better to catch it before it happened - or at least before it happened a second or third time. That would save the bank money too, since they had to cover it (and ultimately the country, since this is a bank receiving billions in government bailout money). And the fact that the bank's systems were not set up to catch it just strengthens the impetus to push the filtering algorithms out to the users.

My credit card company allows me to configure various alerts, such as bills due or thresholds met, and my brokerage account lets me set up simple trading triggers. But no one (at least none of my providers) lets me set up more than simple monitors (like a threshold alert) - nothing that would catch the pattern preceding my bank account fraud. The amounts alone should have triggered an alert, or at least a watch. And no one lets me input other data (my travel schedule, for one thing, would have shown I wasn't in California on those days) to process against the data they already have.
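
Here is the kind of user-defined rule I have in mind - a sketch only, with made-up transaction fields, amounts and travel calendar:

from datetime import date

# Hypothetical transaction records plus a user-supplied travel calendar.
typical_withdrawal = 100  # what my wife or I usually take out

transactions = [
    {"date": date(2009, 6, 10), "amount": 450, "state": "CA"},
    {"date": date(2009, 6, 11), "amount": 500, "state": "CA"},
    {"date": date(2009, 6, 12), "amount": 60,  "state": "MA"},
]
travel = {date(2009, 6, 10): "MA", date(2009, 6, 11): "MA"}  # where I actually was

def alerts(transactions, travel, typical, multiple=4):
    """Flag withdrawals far above my usual amount, or made somewhere
    my own calendar says I wasn't."""
    flagged = []
    for t in transactions:
        if t["amount"] >= multiple * typical:
            flagged.append((t, "amount far above typical withdrawal"))
        elif travel.get(t["date"]) and travel[t["date"]] != t["state"]:
            flagged.append((t, "location does not match my travel schedule"))
    return flagged

for t, reason in alerts(transactions, travel, typical_withdrawal):
    print(t["date"], t["amount"], t["state"], "->", reason)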

Providing consumers the ability to set up filters like this would actually save institutions/businesses (like banks) money (in analysis and engineering overhead) AND produce a better result I think - more customized, more able to take advantage of what I know about myself - more situationally aware.
Not sure exactly what it is (control of a Facebook plus SECURE access to real transactional data?) but I want my PAM...


Wednesday, June 17, 2009

SSO for Clouds
How close are we?

Now that SFDC supports the SAML 2.0 (plus OAuth) standards for Single Sign-On, that joins them up with Google App Engine as two of the major cloud providers on a common path. This should bode well for SSO, I would think. However, I think Amazon, particularly for S3, needs to get on board - since they are by far the biggest player in IaaS, with the greatest number of independent developers - for the tipping point to come. EC2-based apps can obviously be SSO-enabled by the app developer deploying on them, but doesn't it make sense to have this facilitated for app developers who are not SSO experts? And SSO for S3 in particular, as I said, opens up a lot of options for collaborative applications - options for collaboration that is also secure and managed.
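
For app developers who are not SSO experts, the piece a cloud provider could facilitate looks roughly like the sketch below. Every function and field here is a hypothetical placeholder of my own, not any provider's actual API:

# Rough shape of an SP-initiated SSO handshake from the app developer's side.
# Everything here is a hypothetical placeholder, not any provider's real API.

def verify_assertion(saml_response, idp_public_key):
    """Placeholder: a real app would verify the XML signature, audience and
    validity window with a SAML library, not this stub."""
    return {"subject": saml_response.get("subject"), "valid": bool(idp_public_key)}

def issue_scoped_credential(user, scope, ttl_seconds):
    """Placeholder for a short-lived credential the app could use for storage access."""
    return {"user": user, "scope": scope, "expires_in": ttl_seconds}

def handle_saml_response(saml_response, session, idp_public_key="idp-key"):
    """The IdP posts back a signed assertion; verify it, map the asserted identity
    to a local session, and mint a scoped credential for shared storage."""
    assertion = verify_assertion(saml_response, idp_public_key)
    if not assertion["valid"]:
        raise PermissionError("assertion rejected")
    session["user"] = assertion["subject"]
    session["storage_credential"] = issue_scoped_credential(
        assertion["subject"], scope="shared-workspace", ttl_seconds=3600)
    return session

print(handle_saml_response({"subject": "alice@example.com"}, {}))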