HPI Course I, post-lesson 1
course_1 Dominikus Schmidt
Sep 3 at 5:58pm

Remarks, Feedback, Questions


fb:
The audio quality is a bit poor: rather muffled and low in volume, so that a respectable perception level is hard to achieve.
But: NOT noisy = good chance for improvement
( = higher gain at the mixing desk & some more treble)!

r:
1. I felt it courageous to go for a column-wise approach: usually (though not on RDBMS!) height-balanced (binary) node trees are THE method for "static", no-longer-in-use transactional data, because of their "unbeatable" quick access, due to the minimal number of touches per request into the big amounts of data, in particular if these are DRAM-based. But we'll see ...
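To make the contrast concrete, here is a minimal sketch (with hypothetical data, not from the course material) of why a column-wise layout pays off for scans and aggregates, while a tree or row layout is tuned for touching single records:

```python
# Sketch with invented data: row-wise vs. column-wise layout.
# A column store keeps each attribute in its own contiguous array, so an
# aggregate over one attribute scans only that array; a row layout (or a
# tree over full records) must touch every record to reach one field.

rows = [
    {"id": 1, "customer": "A", "amount": 100},
    {"id": 2, "customer": "B", "amount": 250},
    {"id": 3, "customer": "A", "amount": 50},
]

# Column-wise layout: one array per attribute.
columns = {
    "id": [1, 2, 3],
    "customer": ["A", "B", "A"],
    "amount": [100, 250, 50],
}

# Aggregation in the row layout touches whole records:
total_rows = sum(r["amount"] for r in rows)

# Aggregation in the column layout scans one contiguous array:
total_cols = sum(columns["amount"])

assert total_rows == total_cols == 400
```

The point access a tree excels at (one record by key) is exactly what the row layout serves well; the aggregate scan is where the column arrays win.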

2. the column approach reminds me of my n-dimensional "cube" for data validation on the "new" DWH platform for social researchers, built on the huge database of the IAB holding all the German social-insurance data from 1974 onwards (SAS engine, 1999). "Columns" in that respect referred closely to 'data types'.

q: We are apparently talking about non-transactional data, aren't we?
UPDATED below, 3 Sept, 11 pm:

OK, we are NOT!
I'm eager to see how the momentary inconsistency of the transactional data (while transactions are running) is regarded: managed, no longer appearing, negligible, or ...
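The kind of momentary inconsistency meant here can be sketched in a few lines (hypothetical accounts, not from the course): a reader who sums the data halfway through a transfer sees a state that never "officially" existed, which is exactly what isolation or snapshots must hide.

```python
# Sketch with invented accounts: inconsistency DURING a transaction.
# An unisolated reader that aggregates mid-transfer sees money "missing".

accounts = {"a": 100, "b": 0}

def total():
    # an aggregate over the transactional data
    return sum(accounts.values())

assert total() == 100        # consistent before the transfer

# transfer 40 from a to b, step by step:
accounts["a"] -= 40
mid_transfer = total()       # an unisolated reader here sees 60, not 100
accounts["b"] += 40

assert mid_transfer == 60    # the inconsistent intermediate state
assert total() == 100        # consistent again after the transfer
```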

In the first video HP speaks out quite frankly that for his whole 40 years in IT he had to work almost solely on aggregates of the transactional data. This reflects one strong SELLING proposition of SAP: "reporting", business intelligence (BI) and so on, addressing at the selling stage only the MANAGEMENTS, who were usually very impressed by THIS strong side of SAP, while neglecting the weaker "transactional" part, where the actual work of the companies took place, which the management had to inspect, control and steer (on an aggregate basis).

As the transactional work was oriented almost completely towards fulfilling this purpose of producing/delivering proper figures, ratios and overviews, the efficiency of singleton and workflow transacting often diminished to a poor level, as the efficiency of work and workflow did not have the leading role in the design and "customizing" phase of the SAP implementation - "BI" kept it.

Due to these failures at the working level (and others of course: where some occur, more are usually not far away!), the downturns that followed, up to outright bankruptcies, could claim with every right to be the "best managed ones" ever seen in economic history, as some owners and most of the management could often leave the sinking ship IN TIME, while employees and some suppliers were mostly stuck in the trap.

To me, the more skill-demanding and state-of-the-art IT part of managerial "BI" is to shape the actual work first with SAP modules, and then to set the aggregational BI, reporting and the other managerial material on top of it, without hindering/obstructing the work processes.

r:
At the quiz my first impulse was to check "database" (question 2) as well, but I couldn't remember it being mentioned in the script or the slides, and left it out. The search afterwards in the material produced NO explicit mention of it, but it is okay to rely also on implicit content ( = less boring, and an index of the possible depth of understanding), though a hint to the participants about this policy at the beginning of the course might be useful to all parties.


r:
context-specific meanings:
compression = (mainly?) the degree of reduction of the amount of data by aggregation,
not by "compression" tools like zip.
HP: compression NOT for performance reasons NOW (in the new concept), but possibly for something else ...
me: 1. the actual goal/target is often the aggregated material (overviews, summaries, balances, ratios etc.) itself; 2. the thrilling point comes with data integrity, when fishing in the transactional seas, where data are always subject to change ... see above. VERY INTERESTING.
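The reading of "compression" as data reduction by aggregation can be illustrated in a few lines (invented figures, not from the course): many transaction lines collapse into a few balances per account, quite unlike byte-level compression with tools like zip.

```python
# Sketch with invented postings: "compression" as reduction by aggregation.
# Many transaction lines become a few per-account balances.
from collections import defaultdict

postings = [
    ("acct_1", 100), ("acct_1", -30), ("acct_2", 500),
    ("acct_1", 20), ("acct_2", -250),
]

balances = defaultdict(int)
for account, amount in postings:
    balances[account] += amount

# 5 postings reduced to 2 balances:
assert dict(balances) == {"acct_1": 90, "acct_2": 250}
assert len(balances) < len(postings)
```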