The Orphaned Internet Domain Risk

I have clicked on the company websites of social media acquaintances, and something is not right: slight errors in formatting, encoding errors for special German characters.

Then I notice that some of the pages contain links to other websites that advertise products in a spammy way. However, the links to the spammy sites are embedded in these alleged company websites in a subtle way: using the (nearly) correct layout, or embedding the link in a ‘news article’ that also contains legit product information – content really related to the internet domain I am visiting.

Looking up whois information tells me that these internet domains are not owned by my friends anymore – consistent with what they actually say on their social media profiles. So how come they ‘have given’ their former domains to spammers? They did not, and they didn’t need to: spammers simply need to watch out for expired domains, seize them when they become available – and then reconstruct the former legit content from public archives and interleave it with their spammy messages.

The former content of legitimate sites is often available on the web archive. Here is the timeline of one of the sites I checked:

Clicking on the details shows:

  • Last display of legit content in 2008.
  • In 2012 and 2013 a generic message from the hosting provider was displayed: ‘This site has been registered by one of our clients.’
  • After that we see mainly 403 Forbidden errors – so the spammers don’t want their site to be archived – but at one time a screen capture of the spammy site had been taken.

The new site shows the name of the former owner at the bottom, but an unobtrusive link has been added, indicating the new owner – a US-based marketing and SEO consultancy.

So my takeaway is: if you ever feel like decluttering your websites and freeing yourself of your useless digital possessions – and possibly also social media accounts – think twice: as soon as your domain or name is available, somebody might take it, and re-use and exploit your former content, and possibly your former reputation, for promoting their spammy stuff in a shady way.

This happened a while ago, but I know now it can get much worse: why only distribute marketing spam if you can distribute malware through channels still considered trusted? In this blog post Malwarebytes raises the question whether such practices are illegal or not – it seems that question is not straightforward to answer.

Visitors do not even have to visit the abandoned domain explicitly to be hit by the malware served. I have seen some reports of abandoned embedded plug-ins turned into malicious zombies. Silly example: if you embed your latest tweets, Twitter goes out of business, and its domains are seized by spammers – your Follow Me icon might help to spread malware.

If a legit site runs third-party code, it needs to trust the authors of this code. For example, Equifax’s website recently served spyware:

… the problem stemmed from a “third-party vendor that Equifax uses to collect website performance data,” and that “the vendor’s code running on an Equifax Web site was serving malicious content.”

So if you run any plug-ins, embedded widgets, or the like – better check regularly whether the originating domain is still run by the expected owner; monitor your vendors often, and don’t run code you do not absolutely need in the first place. Don’t use embedded active badges if a simple link to your profile would do.
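For a first pass, such an inventory can even be scripted. A minimal sketch – the page snippet and all host names below are made-up placeholders, not real sites:

```python
# Sketch: collect the external host names a page pulls active content
# from, so they can be audited regularly. The snippet and all host
# names are placeholders, not real sites.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyFinder(HTMLParser):
    """Collects host names of external scripts, iframes, and images."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.external_hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "iframe", "img"):
            return
        src = dict(attrs).get("src") or ""
        host = urlparse(src).netloc
        if host and host != self.own_host:
            self.external_hosts.add(host)

page = '<p>Hello</p><script src="https://widgets.example-kraken.com/badge.js"></script>'
finder = ThirdPartyFinder("myblog.example.com")
finder.feed(page)
print(sorted(finder.external_hosts))
```

Every host in that set is a ‘partner’ you implicitly trust – and whose domain registration is worth re-checking now and then.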

Do a painful, boring inventory and assessment often – then you will notice how much work it is to manage these ‘partners’, and you will rather stay away from signing up and registering for too many services.

Data for the Heat Pump System: Heating Season 2016-2017

I update the documentation of measurement data [PDF] about twice a year. This post is to provide a quick overview for the past season.

The PDF also contains the technical configuration and sizing data. Based on typical questions from an ‘international audience’ I add a summary here plus some ‘cultural’ context:

Building: The house is a renovated, nearly 100-year-old building in Eastern Austria: a typical so-called ‘Streckhof’ – an elongated, former small farmhouse. Some details are mentioned here. Heating energy for space heating of two storeys (185m2) and hot water is about 17.000-20.000kWh per year. The roof / attic had been rebuilt in 2008, and the facade was thermally insulated. However, the major part of the house has no underground level, so most energy is lost via the ground. Heating only the ground floor (75m2) with the heat pump reduces heating energy only by 1/3.

Climate: This is the sunniest region of Austria – the lowlands of the Pannonian Plain bordering Hungary. We have Pannonian ‘continental’ climate with low precipitation. Normally, monthly average temperatures in winter are only slightly below 0°C in January, and weeks of ‘ice days’ in a row are very rare.

Heat energy distribution and storage (in the house): The renovated first floor has floor loops while at the ground floor mainly radiators are used. Wall heating has been installed in one room so far. A buffer tank is used for the heating water as this is a simple ‘on-off’ heat pump always operating at about its rated power. Domestic hot water is heated indirectly using a hygienic storage tank.

Heating system: An off-the-shelf, simple brine-water heat pump uses a combination of an unglazed solar-air collector and an underground water tank as a heat source. Energy is mainly harvested from rather cold air via convection.

Addressing often asked questions: Off-the-shelf = the same type of heat pump as used with geothermal systems. Simple: not smart, not trying to be the universal energy management system, as the smartness is in our own control unit and its logic for managing the heat source(s). Brine: a mixture of glycol and water (similar to the fluid used with flat solar thermal collectors) = antifreeze, as the temperature of the brine is below 0°C in winter. The tank is not a seasonal energy storage but a buffer for days or weeks. In this post the hydraulics is described in detail, together with typical operating conditions throughout a year. Both tank and collector are needed: the tank provides a buffer of latent energy during ‘ice periods’ and allows harvesting more energy from air, but the collector actually provides about 75% of the total ambient energy the heat pump needs in a season.

Tank and collector are rather generously sized in relation to the heating demands: about 25m3 volume of water (total volume +10% freezing reserve) and 24m2 collector area.

The overall history of data documented in the PDF also reflects ongoing changes and some experiments, like heating the first floor with a wood stove, toggling the effective area of the collector used between 50% and 100%, or switching off the collector to simulate a harsher winter.

Data for the past season

Finally we could create a giant ice cube naturally: 14m3 of ice formed in the coldest January in 30 years. The monthly average temperature was -3,6°C, 3 degrees below the long-term average.

(Regarding the oscillations of the ice volume, see here and here.)

We heated only the ground floor this season and needed 16.600 kWh (incl. hot water) – about the same heating energy as in the previous season. On the other hand, we also used only half of the collector – 12m2. The heating water inlet temperature for the radiators was only 37°C even in January.

For the first time the monthly performance factor was well below 4. The performance factor is the ratio of output heating energy to input electrical energy for heat pump and brine pump. In middle Europe we measure both energies in kWh 😉 The overall seasonal performance factor was 4,3.
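For the record, the arithmetic is as simple as it sounds – a sketch with invented monthly numbers, not our measured values:

```python
# Performance factor = output heating energy / input electrical energy
# (heat pump plus brine pump), both measured in kWh. The monthly
# numbers below are invented for illustration, not measured values.
def performance_factor(heat_out_kwh, electrical_in_kwh):
    return heat_out_kwh / electrical_in_kwh

monthly = {"Jan": (3100.0, 820.0), "Apr": (1200.0, 280.0)}
for month, (out_kwh, in_kwh) in monthly.items():
    print(month, round(performance_factor(out_kwh, in_kwh), 2))
```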

The monthly performance factor is a bit lower again in summer, when only hot water is heated (and thus the heat pump’s COP is lower because of the higher target temperature).

Per day we needed about 100kWh of heating energy in January, while the collector could not harvest that much:

In contrast to the season of the Ice Storage Challenge, the month before the ‘challenge’ (Dec. 2016) was also not too collector-friendly. But when the ice melted again, we saw the usual large energy harvests. Overall, the collector could not contribute the full ‘typical’ 75% of ambient energy this season.

(Definitions, sign conventions explained here.)

But there was one positive record, too: in the hot summer of 2017 we consumed the highest cooling energy so far – about 600kWh. The floor loops are used for passive cooling; the heating buffer tank is used to transfer heat from the floor loops to the cold underground tank. In ‘colder’ summer nights the collector is in turn used to cool the tank, and every time hot tap water is heated up, the tank is cooled, too.

Of course the available cooling power is just a small fraction of what an AC system sized for the theoretical cooling load would provide. However, this moderate cooling is just what – for me – makes the difference between unbearable and OK on really hot days with more than 35°C peak ambient temperature.

Computers, Science, and History Thereof

I am reading three online resources in parallel – on the history and the basics of computing, computer science, software engineering, and the related culture and ‘philosophy’. An accidental combination I find most enjoyable.

Joel on Software: Joel Spolsky’s blog – a collection of classic essays. What every developer needs to know about Unicode. New terms like Astronaut Architects and Leaky Abstractions. How to start a self-funded software company, how to figure out the price of software, how to write functional specifications. Bringing back memories of my first encounters with Microsoft VBA. He has the best examples – Martian Headsets to explain web standards.

The blog started in 1999 – rather shortly after I had entered the IT industry. So it is an interesting time capsule, capturing technologies and trends I was sort of part of – including the relationship with one large well-known software company.

Somewhere deep in Joel’s blog I found references to another classic; it was in an advice on how to show passion as an applicant for a software developer job. Tell them how reading this moved you to tears:

Structure and Interpretation of Computer Programs. I think I have found the equivalent to Feynman’s Physics Lectures in computer science! I have hardly ever read a textbook or attended a class that was both so philosophically insightful and useful in a hands-on, practical way. Using Scheme (Lisp) as an example, important concepts are introduced step-by-step, via examples, viewed from different perspectives.

It was amazing how far you can get with purely Functional Programming. I did not even notice that they had not used a single assignment (Data Mutation) until far into the course.
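To give a flavor of that style in a more mainstream language – a toy example in Python, with state threaded through recursive calls instead of being mutated (my own illustration, not an example from the book):

```python
# A purely functional running average: no variable is ever reassigned,
# the accumulated state is passed along through the recursive calls.
def averages(values, total=0.0, count=0):
    if not values:
        return []
    head, *tail = values
    new_total, new_count = total + head, count + 1
    return [new_total / new_count] + averages(tail, new_total, new_count)

print(averages([10, 20, 30]))  # [10.0, 15.0, 20.0]
```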

The quality of the resources made available for free is incredible – which holds for all the content I am praising in this post: full textbook, video lectures with transcripts, slides with detailed comments. It is also good to know, and reassuring, that despite the allegedly fast-paced changes of technology, basic concepts have not changed that much in decades.

But if you are already indulging in nostalgic thoughts why not catch up on the full history of computing?

Creatures of Thought. A sublime, book-like blog on the history of computing – starting with the history of telephone networks and telegraphs, covering computing machines – electro-mechanical or electronic – and related, maybe underappreciated hardware components like the relay, and including biographic vignettes of the heroes involved.

The author’s PhD thesis (available for download on the About page) covers the ‘information utility’ vision that was ultimately superseded by the personal computer. This is an interesting time capsule for me as well, as this story ends about where my personal journey started – touching personal PCs in the late 1980s, but having been taught the basics of programming via sending my batch jobs to an ancient mainframe.

From such diligently done history of engineering I can only learn not to rush to any conclusions. There are no simple causes and effects, or unambiguous stories about who invented what and who was first. It’s all subtle evolution and meandering narratives, randomness and serendipity. Quoting from the post that indicates the beginning of the journey, on the origins of the electric telegraph:

Our physics textbooks have packaged up the messy past into a tidy collection of concepts and equations, eliding centuries of development and conflict between competing schools of thought. Ohm never wrote the formula V = IR, nor did Maxwell create Maxwell’s equations.

Though I will not attempt to explore all the twists and turns of the intellectual history of electricity, I will do my best to present ideas as they existed at the time, not as we retrospectively fit them into our modern categories.

~

Phone, 1970s, Austria

The kind of phone I used at the time when the video lectures for Structure and Interpretation of Computer Programs were recorded and when I submitted my batch jobs of Fortran code to be compiled. I have revived the phone now and then.


Simulations: Levels of Consciousness

In a recent post I showed these results of simulations for our heat pump system:

I focused on the technical details – this post will be more philosophical.

What is a ‘simulation’ – as opposed to simplified calculations of monthly or yearly average temperatures or energies? The latter are provided by tools used by governmental agencies or standardization bodies – allowing for a comparison of different systems.

In a true simulation the time intervals are so small that you catch all ‘relevant’ changes of a system. If a heating system is turned on for one hour, then turned off again, the time slot needs to be smaller than one hour. I argued before that calculating meaningful monthly numbers requires incorporating data that had been obtained before by measurements – or by true simulations.

For our system, the heat flow between ground and the water/ice tank is important. In our simplified sizing tool – which is not a simulation – I use average numbers. I validated them by comparing with measurements: The contribution of ground can be determined indirectly; by tallying all the other energies involved. In the detailed simulation I calculate the temperature in ground as a function of time and of distance from the tank, by solving the Heat Equation numerically. Energy flow is then proportional to the temperature gradient at the walls of the tank. You need to make assumptions about the thermal properties of ground, and a simplified geometry of the tank is considered.
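To illustrate the principle (not our actual model, which uses a more realistic geometry and validated material properties), here is a minimal one-dimensional sketch with invented parameters:

```python
# Minimal 1D sketch: temperature in the ground as a function of time
# and distance from the tank wall, via explicit finite differences for
# the heat equation dT/dt = alpha * d2T/dx2. Diffusivity, grid, and
# boundary temperatures are invented placeholders.
def step(T, alpha, dx, dt):
    """Advance all interior points by one explicit Euler time step."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme would be unstable"
    return ([T[0]]
            + [T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
               for i in range(1, len(T) - 1)]
            + [T[-1]])

# Tank wall held at 0 °C (ice forming), undisturbed ground at 10 °C:
T = [0.0] + [10.0] * 9
alpha, dx, dt = 7e-7, 0.1, 3600.0       # m2/s, m, s
for _ in range(24):                     # one day in hourly steps
    T = step(T, alpha, dx, dt)

# Energy flow into the tank is proportional to the gradient at the wall:
gradient_at_wall = (T[1] - T[0]) / dx   # K/m, positive: ground heats the tank
```

The real calculation differs in geometry and boundary conditions, but the core loop – update the temperature profile, read off the gradient at the wall – is the same.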

Engineering / applied physics in my opinion is about applying a good-enough-approach in order to solve one specific problem. It’s about knowing your numbers and their limits. It is tempting to get carried away by nerdy physics details, and focus on simulating what you know exactly – forgetting that there are huge error bars because of unknowns.

This is the hierarchy I keep in mind:

On the lowest level is the simulation of physics, that is: modelling how ‘nature’ and the system’s components react to changes in the previous time slot. Temperatures change because energy flows, and energy flows because of temperature differences. The heat pump’s output power depends on heating water temperature and brine temperature. Energy of the building is ‘lost’ to the environment via heat conduction; heat exchangers immersed in tanks deposit energy there or retrieve it. I found that getting the serial connection of heat exchangers right in the model was crucial, and it required a self-consistent calculation of three temperatures at the same point in time, rather than trying to ‘follow the brine around’. I used the information on average brine temperatures obtained by this method to run a simplified version of the simulation using daily averages only – for estimating the maximum volume of ice over two decades.

So this means you need to model your exact hydraulic setup, or at least you need to know which features of your setup are critical and worth modelling in detail. But the same also holds for the second level, the simulation of control logic. I try to mirror production control logic as far as possible: this code determines how pumps and valves will react, depending on the system’s prior status. Both in real life and in the simulation, threshold values and ‘hystereses’ are critical: you start to heat if some temperature falls below X, but you only stop heating once it has risen above X plus some Delta. Typical brine-water heat pumps always provide approximately the same output power, so you control operating time and buffered heating energy. If Delta for heating the hot water buffer tank is too large, the heat pump’s performance will suffer: the Coefficient of Performance of the heat pump decreases with increasing heating water temperature. Changing an innocuous parameter can change results a lot, and the ‘control model’ should be given the same vigilance as the ‘physics model’.
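The hysteresis logic itself is simple; a sketch with invented threshold values:

```python
# Sketch of the hysteresis idea from the 'control model': start heating
# below a set point X, stop only once the temperature exceeds X + Delta.
# Threshold values below are invented for illustration.
def update_heating(is_on, temperature, setpoint=50.0, delta=5.0):
    if temperature < setpoint:
        return True                 # below X: heat
    if temperature > setpoint + delta:
        return False                # above X + Delta: stop
    return is_on                    # inside the band: keep previous state

state = False
for temperature in [49.0, 52.0, 56.0, 52.0]:
    state = update_heating(state, temperature)
    print(temperature, state)
```

The third reading (56,0) switches heating off; the fourth (52,0) does not switch it back on. Without the Delta band, the compressor would cycle on and off around the set point.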

Control units can be tweaked at different levels: ‘experts’ can change the logic, while end users can change non-critical parameters, such as set point temperatures. We don’t restrict expert access in systems we provide the control unit for, but it makes sense to require extra input for the expert level – to prevent accidental changes.

And here we enter level 3 – users’ behavior. We humans are bad at trying to outsmart the controller.

[Life-form in my home] always sets the controller to ‘Sun’. [little sun icon indicating manually set parameters]. Can’t you program something so that nothing actually changes when you pick ‘Sun’?

With heat pumps utilizing ground or water sources – ‘built’ storage repositories with limited capacity – unexpected and irregular system changes are critical: you have to size your source in advance. You cannot simply order one more lorry load of wood pellets or oil if you ‘run out of fuel’. If the source of ambient energy is depleted, the heat pump will finally refuse to work below a certain source temperature. The heat pump’s rated power has to match the heating demands and the size of the source exactly. It also must not be oversized, in order to avoid turning the compressor on and off too often.

Thus you need good estimates for peak heat load and yearly energy needs, and models should include extreme weather (‘physics’) but also erratic users’ behaviour. The more modern the building, the more important spikes in hot tap water usage get in relation to space heating. A vendor of wood pellet stoves told me that delivering peak energy for hot water – used in private bathrooms that match spas – is a greater challenge today than delivering space heating energy. Energy certificates of modern buildings take into account huge estimated solar and internal energy gains – calculated according to standards. But the true heating power needed on a certain day will depend on the strategy or automation home owners use when managing their shades.

Typical gas boilers are oversized (in terms of kW rated power) by a factor of 2 or more in Germany, but with heat pumps you need to be more careful. However, this also means that heat pump systems cannot and should not be planned for rare peak demands, such as: 10 overnight guests want to shower in the morning one after the other, on an extremely cold day, or for heating up the building quickly after temperature had been decreased during a leave of absence.

The nerdy answer is that a smart home would know when your vacation ends and start heating up well in advance. Not sure what to do about the showering guests as in this case ‘missing’ power cannot be compensated by more time. Perhaps a gamified approach will work: An app will do something funny / provide incentives and notifications so that people wait for the water to heat up again. But what about planning for renting a part of the house out someday? Maybe a very good AI will predict what your grandchildren are likely to do, based on automated genetics monitoring.

The challenge of simulating human behaviour is ultimately governed by constraints on resources – such as the size of the heat source: future heating demands and energy usage are unknown, but the heat source has to be sized today. If the system is ‘open’ and connected to a ‘grid’ in a convenient way, problems seem to go away: you order whatever you need, including energy, any time. The opposite is planning for true self-sufficiency: I once developed a simulation for an off-grid system using photovoltaic generators and wind power – for a mountain shelter. It had to meet tough regulations and hygienic standards like any other restaurant, e.g. using ‘industry-grade’ dishwashers needing 10kW of power. In order to provide that by solar power (plus battery) you needed to estimate the number of guests likely to visit … and thus how many people would go hiking on a specific day … and thus maybe the weather forecast. I tried to factor in a ‘visiting probability’ based on the current weather.
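Stripped down to a toy calculation, the kind of estimate involved looks like this (all numbers invented):

```python
# Toy version of sizing against uncertain demand: expected daily
# dishwasher energy as a function of a weather-dependent visiting
# probability. All numbers are invented for illustration.
def expected_load_kwh(p_good_weather, guests_good=80, guests_bad=15,
                      runs_per_guest=0.05, kwh_per_run=1.2):
    expected_guests = (p_good_weather * guests_good
                       + (1 - p_good_weather) * guests_bad)
    return expected_guests * runs_per_guest * kwh_per_run

print(round(expected_load_kwh(0.9), 2))  # sunny forecast
print(round(expected_load_kwh(0.2), 2))  # rainy forecast
```

The fragile part is not the arithmetic but the inputs: the guest numbers and the probability itself are exactly the kind of human-behaviour unknowns discussed above.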

I think many of these problems can be ‘resolved’ by recognizing that they are first-world problems. It takes tremendous effort – in terms of energy use or systems’ complexity – to obtain 100% availability and to cover all exceptional use cases. You would need the design heat load only for a few days every decade. On most winter days a properly sized heat pump is operating for only 12 hours. The simple, low-tech solution would be to accept the very, very rare intermittent 18,5°C room temperature, mitigated by proper clothing. Accepting a 20-minute delay of your shower solves the hot water issue. An economic analysis can reveal the (most likely very small) trade-off of providing exceptional peak energy by a ‘backup’ electrical heating element – or by using that wood stove that you installed ‘as a backup’ but mostly for ornamental reasons, because it is dreadful to fetch the wood logs when it is really cold.

But our ‘modern’ expectations and convenience needs are also reflected in regulations. Contractors are afraid of being sued by malicious clients who (quote) sit next to their heat pump and count its operating cycles – to compare the numbers with the ones ‘guaranteed’. In a weather-challenged region at more than 2.000 meters altitude, people need to steam-clean dishes and use stainless steel instead of wood – where wooden plates have been used for centuries. I believe that regulators are as prone as anybody else to fall into the nerdy trap described above: you monitor, measure, calculate, and regulate in detail the things that you can measure, and because you can measure them – not because these things are top priorities or have the most profound impact.

Still harvesting energy from air - during a record-breaking cold January 2017

The Future of Small Business?

If I were asked which technology or ‘innovation’ has had the most profound impact on the way I work, I would answer: working remotely – with clients and systems I hardly ever see.

20 years ago I played with modems, cumbersome dial-in, and Microsoft’s NetMeeting. Few imagined then that remote work would one day be the new normal. Today I am reading about Industry 4.0, 3D printing, the Internet of Things, and how every traditional company has to compete with Data Krakens like Google and Amazon. Everything will be offered as a service, including heating. One consequence: formerly independent craftsmen become preferred partners or subcontractors of large companies – of vendors of smart heating solutions. Creative engineering is replaced by calling the Big Vendor’s hotline. Human beings cover the last mile that robots or software cannot deal with – yet.

Any sort of customization, consulting, support, and systems integration might be automated in the long run: Clients will use an online configurator and design their systems, and possibly print them out at home. Perhaps someday our clients will print out their heat exchangers from a blueprint generated on Kraken’s website, instead of using our documentation to build them.

Allowing you to work remotely also allows everybody else in the world to do so, and you might face global competition once the barriers of language and culture have been overcome (by using ubiquitous US culture and ‘business English’). Large IT service providers have actually considered turning their consulting and support staff into independent contractors and letting them compete globally – using an online bidding platform. Well-known Data Krakens match clients and freelancers, and I’ve seen several start-ups that aspire to become the next matching Kraken platform for computer / tech support. Clients will simply not find you if you are not on the winning platform. Platform membership becomes as important as having a website or an entry in a business directory.

One seemingly boring and underappreciated point that works enormously in favor of the platforms is bureaucracy: as a small business you have to deal with many rules and provisions, set forth by large entities – governments, big clients, big vendors. Some of those rules are conflicting, and meeting them all in the best possible way does not allow for much creativity. Krakens’ artificial intelligence – and their lawyers and lobbyists – might be able to fend off bureaucracy better than a freelancer. If you want to sell things to clients in different countries, you had better defer the legally correct setup of the online shop to the Kraken Platform, which deals with the intricacies of ever-evolving international tax law – while you become their subcontractor or franchisee. In return, you will dutifully sign the Vendor’s Code of Conduct every year and follow the logo guidelines when using the Kraken’s corporate identity.

In my gloomy post about Everything as a Service I came to the conclusion that we – small businesses who don’t want to grow and become start-ups aspiring to Krakenhood themselves – will either work as the Kraken’s hired hands, or …

… a lucky few will carve out a small niche and produce or customize bespoke units for clients who value luxurious goods for the sake of uniqueness or who value human imperfection as a fancy extra.

My personal credo is rather a very positive version of this quote, minus the cynicism. I am happy as a small business owner. This is just a single data point, and I don’t have a self-consistent theory on this. But I have Skin in the Game, so I share my anecdotes and some of the things I learned.

Years ago I officially declared my retirement from IT Security and global corporations – to plan special heat pump systems for private home owners instead. Today we indeed work on such systems, and the inside joke of doing this remote-only – ‘IT-style’ – has become routine. Clients find us via our blog, which is sometimes mistaken for a private fun blog and whose writing feels like that. I have to thank Kraken Google, begrudgingly. A few of my Public Key Infrastructure clients insisted on hiring me again despite my declarations of looming ignorance in all things IT. All this allows for very relaxed, self-marketing-pressure-free collaborations.

  • I try to stay away, or move farther away, from anything strictly organized, standardized, or ‘platform-mediated’. Agreements are made by handshake. I don’t submit formal applications or replies to Requests for Proposals.
  • “If things do not work without a written contract, they don’t work with a contract either.”
  • I hardly listen to business experts, especially if they try to give well-meant, but unsolicited advice. Apply common sense!
  • Unspectacular time-tested personal business relationships beat 15 minutes of fame any time.
  • My work has to speak for itself, and ‘marketing’ has to be a by-product. I cannot compete with companies who employ people full-time for business development.
  • The best way to protect your inner integrity is to know and to declare what you do not want and what you would never do. Removing the absolute negatives leaves a large area of positive background, and contrary to the mantra of specific ‘goals’, this approach lets you discover unexpected upsides. This is Nassim Taleb’s Via Negativa – and any career or business advice that speaks to me revolves around that.
  • There is no such thing as the True Calling or the One and Only Passion – I like the notion of a Portfolio of Passions. I think you come to enjoy what you are learning to be good at – not the other way around.
  • All this is the result of years of experimenting in a ‘hyperspace of options’ – there is no shortcut. I have to live with the objection that I have just been lucky, but I can say that I made many conscious decisions whose ‘goal’ was to increase the number of options rather than to narrow them down (Taleb’s Optionality).

So I will finally quote Nassim Taleb, who nailed it as usual – in his Facebook post about The New Artisan:

Anything you do to optimize your work, cut some corners, squeeze more “efficiency” out of it (and out of your life) will eventually make you hate it.

I have bookmarked this link for a while – because sometimes I need to remind myself of all the above.

Taleb states that an Artisan …

1) does things for existential reasons,
2) has some type of “art” in his/her profession, stays away from most aspects of industrialization, combines art and business in some manner (his decision-making is never fully economic),
3) has some soul in his/her work: would not sell something defective or even of compromised quality because what people think of his work matters more than how much he can make out of it,
4) has sacred taboos, things he would not do even if it markedly increased profitability.

… and I cannot agree more. I have lots of Sacred Taboos, and they have served me well.

Other People Have Lives – I Have Domains

These are just some boring update notifications from the elkemental Webiverse.

The elkement blog has recently celebrated its fifth anniversary, and the punktwissen blog will turn five in December. Time to celebrate this – with new domain names that say exactly what these sites are: ‘elkement.blog’ and ‘punktwissen.blog’.

Actually, I wanted to get rid of the ads on both blogs, and with the upgrade came a free domain. WordPress has a detailed cookie policy – and I am showing it dutifully using the respective widget, but they have to defer to their partners when it comes to third-party cookies. I only want to worry about research cookies set by Twitter and Facebook, not by ad providers, and I am also considering removing the social media sharing buttons and the embedded tweets. (Yes, I am thinking about this!)

On the websites under my control I went full dinosaur: the server sends only non-interactive HTML pages to the client, not requiring any client-side activity. I have now gotten rid of the last half-hearted usage of a session object and the respective cookie, and I have never used any social media buttons or other tracking.

So there are no login data or cookies to protect, and yet I finally migrated all sites to HTTPS.

It is a matter of principle: I of all website owners should use HTTPS. For 15 years I have been planning and building Public Key Infrastructures and troubleshooting X.509 certificates.

But of course I fear Google’s verdict: they announced long ago that HTTPS is considered a positive ranking signal by their search engine. Pages not using HTTPS will be tagged as insecure with more and more terrifying icons – e.g. HTTP-only pages with login buttons already display a struck-through padlock in Firefox. In the past years I migrated a lot of PKIs from SHA1 to SHA256 to fight the first wave of Insecure icons.

Finally Let’s Encrypt has started a revolution: free SSL certificates, based on domain validation only. My hosting provider uses a solution based on Let’s Encrypt – a reverse proxy that does the actual HTTPS. I only had to re-target all my DNS records to the reverse proxy – it would have been very easy, had it not been for all my existing URL rewriting, tweaking, and redirecting. I also wanted to keep the option of still using HTTP in the future for tests and special scenarios (like hosting a revocation list), so I decided to do the redirecting myself in the application(s) instead of using the offered automated redirect. But a code review and clean-up now and then can never hurt 🙂 For large, complex sites the migration to HTTPS is anything but easy.
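The application-level redirect decision is conceptually simple. A sketch – host and paths are placeholders, and my real sites implement this in their own server-side code:

```python
# Decide whether a request should be redirected to HTTPS. Certain paths
# (e.g. a certificate revocation list) must stay reachable over plain
# HTTP. Host and paths below are placeholders for illustration.
HTTP_ONLY_PREFIXES = ("/pki/crl",)

def target_url(path, scheme, host="example.com"):
    if scheme == "https" or path.startswith(HTTP_ONLY_PREFIXES):
        return None                     # no redirect needed
    return f"https://{host}{path}"      # issue a 301 to this URL

print(target_url("/blog/post", "http"))         # https://example.com/blog/post
print(target_url("/pki/crl/root.crl", "http"))  # None
```

Keeping the exceptions in one explicit list is what makes the later code reviews and clean-ups bearable.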

In case I ever forget which domains and host names I use, I just need to check out this list of Subject Alternative Names again:
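The SAN list can also be pulled from the live certificate with nothing but the standard library. A minimal sketch (`example.com` is a stand-in for any of my host names):

```python
# Sketch: list the DNS Subject Alternative Names of a live certificate,
# using only the Python standard library.
import socket
import ssl

def dns_sans(cert):
    """DNS names from a dict as returned by SSLSocket.getpeercert()."""
    return [value for key, value in cert.get("subjectAltName", ())
            if key == "DNS"]

def fetch_cert(hostname, port=443):
    """Fetch and validate the peer certificate of an HTTPS server."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

# Usage (requires network access):
# print(dns_sans(fetch_cert("example.com")))
```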

(And I have another certificate for the ‘test’ host names that I need for testing the sites themselves and also for testing various redirects ;-))

WordPress.com also uses Let’s Encrypt (Automattic is a sponsor), and the SAN elkement.blog is lumped together with several other blog names, presumably the ones that needed new certificates at about the same time.

It will be interesting to see what the consequences for phishing websites will be: malicious websites will look trusted, as they are issued certificates automatically – but revoking a certificate might provide another method for invalidating a malicious website.

Anyway, special thanks to the WordPress.com Happiness Engineers and support staff at my hosting provider Puaschitz IT. Despite all the nerdiness displayed on this blog I prefer hosted / ‘shared’ solutions when it comes to my own websites because I totally like it when somebody else has to patch the server and deal with attacks. I am an annoying client – with all kinds of special needs and questions – thanks for the great support! 🙂

Mr. Bubble Was Confused. A Cliffhanger.

This year we experienced a record-breaking January in Austria – the coldest in 30 years. Our heat pump system produced 14 m³ of ice in the underground tank.

The volume of ice is measured by Mr. Bubble, the winner of The Ultimate Level Sensor Casting Show run by the Chief Engineer last year:

The classic, analog level sensor was very robust and simple, but required continuous human intervention:

Level sensor: The old way

So a multitude of prototypes had been evaluated …

Level sensors: The precursors

The challenge was to measure small changes in level, as 1 mm corresponds to about 0.15 m³ of ice.

Mr. Bubble uses a flow of bubbling air in a tube; the measured pressure increases linearly with the distance of the liquid level from the nozzle:

[Image: Mr. Bubble’s bubbler measuring tube]
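The bubbler principle in numbers: the gauge pressure needed to push bubbles out of the nozzle equals the hydrostatic pressure of the water column above it, p = ρ·g·h. A minimal sketch, assuming plain water at standard density (the real device of course has its own calibration constants, which are not in this post):

```python
# Sketch of Mr. Bubble's basic arithmetic; assumes plain water at
# standard density, not the actual calibration of the real sensor.
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def column_above_nozzle_m(gauge_pressure_pa):
    """Height of the water column above the nozzle, in metres.

    The pressure needed to push bubbles out of the tube equals the
    hydrostatic pressure of that column: p = rho * g * h.
    """
    return gauge_pressure_pa / (RHO_WATER * G)

# 981 Pa of gauge pressure corresponds to a 0.1 m column of water.
```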

Mr. Bubble is fine and sane as long as ice is growing monotonically: ice grows from the heat exchanger tubes into the water, and the heat exchanger does not float up due to buoyancy, as it is attached to the supporting construction. The design makes sure that not-yet-frozen water can always ‘escape’ to higher levels to make room for growing ice. Finally, Mr. Bubble lives inside a hollow cylinder of water inside a block of ice. As long as all the ice is covered by water, Mr. Bubble’s calculation is correct.

But when the ambient temperature rises and the collector harvests more energy than needed by the heat pump, melting starts at the heat exchanger tubes. The density of ice is smaller than that of water, so the water level in Mr. Bubble’s hollow cylinder is below the surface level of the ice:

Mr. Bubble is utterly confused and literally driven over the edge – having to deal with this cliff of ice:

When ice is melted, the surface level inside the hollow cylinder drops quickly, as the diameter of the cylinder is much smaller than the width of the tank. So the alleged volume of ice perceived by Mr. Bubble seems to drop extremely fast and out of proportion: 1 m³ of ice is equivalent to 93 kWh of energy – the energy our heat pump would need on an extremely cold day. On an ice-melting day the heat pump needs much less, so a drop of more than 1 m³ per day is an artefact.
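The two conversion factors quoted above – 1 mm of level change ≈ 0.15 m³ of ice, and 1 m³ of ice ≈ 93 kWh – make this sanity check easy to script; a sketch:

```python
# Sanity check using the two conversion factors from the post:
# 1 mm of level change ~ 0.15 m^3 of ice; melting 1 m^3 takes ~93 kWh.
M3_PER_MM = 0.15
KWH_PER_M3 = 93.0

def apparent_ice_change_m3(level_change_mm):
    """Ice volume change Mr. Bubble infers from a raw level change."""
    return level_change_mm * M3_PER_MM

def energy_to_melt_kwh(volume_m3):
    """Energy actually required to melt that much ice."""
    return volume_m3 * KWH_PER_M3

# A 10 mm apparent drop reads as 1.5 m^3 of ice gone - about 140 kWh of
# melting energy in a single day, far more than a mild day can deliver,
# hence an artefact.
```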

As long as there are ice castles on the surface, Mr. Bubble keeps underestimating the volume of ice. When it gets colder, ice grows again, and its growth is then overestimated via the same effect. Mr. Bubble amplifies the oscillations in growing and shrinking of ice.

In the final stages of melting, a slab-with-a-hole-like structure ‘mounted’ above the water surface remains. The actual level of water is lower than it was before the ice period. This is reflected in the raw data – the measured distance. The volume-of-ice output is calibrated not to show negative values, but the underlying measurement data do:

Only when all the ice has finally melted – slowly, via thermal contact with air – is the water level back to normal.

In the final stages of melting, parts of the suspended slab of ice may break off, and the small floating icebergs can confuse Mr. Bubble, too:

So how can we picture the true evolution of ice during melting? I am simulating the volume of ice, based on our measurements of air temperature. To be detailed in a future post – this is my cliffhanger!

>> Next episode.