Blender Solar Panel Material
This is a fully procedurally created solar panel material. Borders, wires, and colors can be easily adjusted.
You can use this asset under the CC BY 4.0 license.
This guide is written for RabbitMQ 3.9!
To configure the RabbitMQ server we will use a rabbitmq.conf and a definitions.json. I stored these files in a rabbitmq folder to keep my project folder structure clean. This setup is derived from sudo's answer on StackOverflow.
The rabbitmq.conf deactivates the default guest user and tells RabbitMQ to load the definition file.
loopback_users.guest = false
management.load_definitions = /etc/rabbitmq/definitions.json
In the definition file we can define our users, vhosts and permissions.
{
"rabbit_version": "3.9",
"users": [
{
"name": "local_jobs",
"password_hash": ">>>HASH<<<",
"hashing_algorithm": "rabbit_password_hashing_sha256",
"tags": ""
},
{
"name": "adminuser",
"password_hash": ">>>HASH<<<",
"hashing_algorithm": "rabbit_password_hashing_sha256",
"tags": "administrator"
}
],
"vhosts": [
{
"name": "\/"
}
],
"permissions": [
{
"user": "local_jobs",
"vhost": "\/",
"configure": ".*",
"write": ".*",
"read": ".*"
}
],
"parameters": [],
"policies": [],
"queues": [],
"exchanges": [],
"bindings": []
}
To add the configuration to the RabbitMQ server, both files are mounted via the volumes option in the docker-compose.yml:
rabbitmq:
hostname: rabbitmq
image: rabbitmq:3.9-management
command: rabbitmq-server
ports:
- "5672:5672"
- "15672:15672"
volumes:
- ./rabbitmq/rabbitmq.conf:/etc/rabbitmq/rabbitmq.conf:ro
- ./rabbitmq/definitions.json:/etc/rabbitmq/definitions.json:ro
To generate the password hashes I used the Python script by Todd Lyons on StackOverflow:
#!/usr/bin/env python3
# rabbitMQ password hashing algo as laid out in:
# http://lists.rabbitmq.com/pipermail/rabbitmq-discuss/2011-May/012765.html
from __future__ import print_function
import base64
import os
import hashlib
import struct
import sys
# This is the password we wish to encode
password = sys.argv[1]
# 1. Generate a random 32-bit salt (4 bytes of random data):
salt = os.urandom(4)
# 2.Concatenate that with the UTF-8 representation of the plaintext password
tmp0 = salt + password.encode('utf-8')
# 3. Take the SHA256 hash and get the bytes back
tmp1 = hashlib.sha256(tmp0).digest()
# 4. Concatenate the salt again:
salted_hash = salt + tmp1
# 5. convert to base64 encoding:
pass_hash = base64.b64encode(salted_hash)
print(pass_hash.decode("utf-8"))
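To sanity-check a generated hash, the steps can be reversed: split off the 4-byte salt, re-hash the password with it, and compare. A small sketch (this helper is my own addition, not part of the original script):

```python
import base64
import hashlib

def check_password(password, encoded_hash):
    """Check a plaintext password against a RabbitMQ SHA-256 password hash."""
    decoded = base64.b64decode(encoded_hash)
    # The first 4 bytes are the salt, the rest is the salted SHA-256 digest.
    salt, expected = decoded[:4], decoded[4:]
    candidate = hashlib.sha256(salt + password.encode("utf-8")).digest()
    return candidate == expected
```

This mirrors steps 1 to 5 of the script above and is handy to confirm that the hash you paste into definitions.json matches the intended password.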
I built a small tool for the 2021 German federal election (Bundestagswahl) to look up how often terms appear in the parties' election manifestos.
The search makes it possible to display several terms in relation to each other.
Example queries:
Klima,Wirtschaft
Klima+Umwelt
"Klima"
In the last months I've spent a lot of time researching and thinking about the ecological impact of blockchains. Bitcoin and Ethereum, the two most popular blockchain technologies, consume vast amounts of power. This power consumption is not inherent to blockchain technology, but a result of the way these blockchains are secured.
The networks of computers running Bitcoin and Ethereum establish trust by making changes to the blockchain expensive. This is achieved by requiring a puzzle to be solved whenever something is appended to the chain. Miners (the people who solve these puzzles) have to spend money on hardware and electricity to do so. The difficulty of the puzzle is adjusted automatically to the number (and power) of computers trying to solve it. Without this adjustment, changes would become inexpensive and the blockchain would become vulnerable. This method of securing a blockchain is called "Proof of Work" (PoW).
There are other methods to establish trust in a blockchain network, like Proof of Stake (PoS), which require far less energy. I want to focus on Proof of Work as this is the system that currently secures the most value in the cryptocurrency space.
It's hard to calculate the real power consumption of a blockchain. Estimates are mostly based on the current hash rate (how many attempts to solve the puzzle are executed each second) and the efficiency of the hardware used. It is almost impossible to know how efficient the deployed hardware is, which leads to large ranges in the estimates. As of writing, Bitcoin's energy usage probably lies somewhere between 43 TWh and 477 TWh per year. In the following I will not concern myself with the efficiency of the hardware. Instead I try to derive the energy consumption of any PoW blockchain from the value of its coins and how much miners are paid in block rewards.
The reason miners build and operate their mining rigs is simple: they want to earn money. For each block a miner finds, she earns some coins.
There are generally two ways miners are rewarded for their work:
First, the miner may create new coins for each block found. The block reward can change over time; for example, Bitcoin started with 50 BTC per block and the reward is lowered gradually until, eventually, no new coins are added to the blockchain. This mechanism is used to create the initial pool of coins and to incentivise mining while very few transaction fees are available.
As the block reward diminishes over time, transaction fees play an increasingly important role in incentivising mining. When you create a transaction to be added to the blockchain, a transaction fee can be attached. This fee goes to the miner who creates the block containing the transaction. Once no more coins are generated out of thin air, the transaction fees have to pay the miners' bills.
The block reward gives us an upper limit on the expenses of all miners. If 10 BTC are earned on average per block, all miners combined can spend at most 10 BTC to create a new block. If more resources were spent on creating blocks over a longer period of time, miners would go bankrupt.
Miners have two main costs: hardware and electricity; everything else is overhead. To stay profitable they constantly try to minimize the price per computed hash. The number of blocks generated by a miner is directly proportional to the share of the global hash rate the miner controls: a miner holding 1% of the global hash rate will create 1% of the blocks. As hardware gets more efficient, in terms of hashes per unit of energy, miners upgrade their hardware to minimize their costs and secure more blocks. When other miners upgrade too, the advantage of any single miner is erased, since the improved hash rate is lost when everybody increases theirs. This leads to a cycle of constant upgrades to stay ahead of the curve. The upgrade cycle only drives up the global hash rate; it does not lower the power consumption of the network as a whole, because miners are only trying to secure a bigger piece of the block reward cake.
If you look at mining from above, ignoring the single miner, you see a collective that constantly buys more efficient hardware but somehow manages not to reduce its power consumption. Instead, the power consumption of the collective is dictated by the rewards themselves.
Higher rewards for mining, either through coins becoming more valuable or more fees being paid, lead to investments in mining equipment. Existing miners buy more machines and new people start mining. This leads to a higher energy consumption of the mining collective.
Any mining reward not used for new hardware (or kept as profit) will be used to buy energy to run the machines.
This gives us four factors that control the energy usage of all miners.
The used energy of the network per block can be computed as:
([Block Reward] * [Exchange Rate] * [Proportional Energy Costs]) / [Energy Price]
= [Energy Usage per Block]
Abstracting away blocks and exchange rates this could be simplified to:
a * [Daily Rewards] / [Energy Price] = [Daily Energy Usage]
where a is the proportion of the rewards used for energy.
(The calculation assumes that a new block is generated every 10 minutes)
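Plugging illustrative numbers into the formula shows the scale. All inputs below are assumptions chosen for the example, not measurements:

```python
# All input values are assumptions for illustration, not measurements.
block_reward = 6.25      # coins per block (BTC after the 2020 halving)
exchange_rate = 40_000   # $ per coin (assumed)
energy_share = 0.5       # proportion of rewards spent on energy (assumed)
energy_price = 0.05      # $ per kWh (assumed)

revenue_per_block = block_reward * exchange_rate                    # $
energy_per_block = revenue_per_block * energy_share / energy_price  # kWh
blocks_per_year = 6 * 24 * 365  # one block every 10 minutes
energy_per_year_twh = energy_per_block * blocks_per_year / 1e9

print(f"Revenue per block: {revenue_per_block:,.0f} $")       # 250,000 $
print(f"Energy per block: {energy_per_block / 1e6:.2f} GWh")  # 2.50 GWh
print(f"Energy per year: {energy_per_year_twh:.1f} TWh")      # 131.4 TWh
```

With these assumed inputs the formula lands within the 43 TWh to 477 TWh range of the estimates mentioned above.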
(Interactive calculator on the original page: Revenue per Block in $, Energy per Block in GWh, Energy per Year in TWh.)
The energy consumption of a PoW blockchain is proportional to the value a miner can earn per block. If the value of the reward doubles, either through higher fees or higher coin prices, the energy consumption will eventually double as well. Similarly, the power consumption will rise if the price of energy drops.
A widely used blockchain, one that could potentially replace fiat currencies, would have a high transaction volume, leading to a large pool of fees to be collected by miners, and eventually to a correspondingly large energy consumption.
I've created a simple website to project and visualize the end of the COVID-19 pandemic. The page is refreshed once per day.
https://h4kor.github.io/end-of-covid/
You can select countries at the bottom of the page. Feedback, feature requests, or improvements can be submitted on GitHub.
I cleaned up and published a dataset that had been sitting on my disk for a while. After a talk about Datasette at CozyConf, I used this dataset as an example to try out the tool myself.
The result can be found here: https://aachen-bussgelder-k3fasuseya-ey.a.run.app/
(The repository: https://github.com/h4kor/aachen-bussgelder)
I'm working on a small game with a procedural map. As the whole game is set in a labyrinth-like building I've decided to use tile maps.
The map generation algorithm produces a 2D matrix of 0s and 1s.
A 1 indicates that the tile is walkable, while a 0 represents an impassable field.
In a simple 2D setting this could be represented by using a floor sprite for each 1 and a wall sprite for each 0.
However, this isn't very appealing.
Instead, the visual representation of a tile should depend on its neighborhood.
Each tile can have one of 256 (2^8) configurations.
These configurations can be reduced to 41 by considering rotations as one configuration.
As multiple configurations need the same wall layout, we only need 15 tiles to render the tile map.
My first approach to implement the rendering was to use a shit ton of if statements, testing all configurations to find the needed piece and rotation.
A few shortcuts saved me from testing all 256 configurations, but it was still too much.
After implementing this approach for 4 tile pieces I gave up.
My next idea was to represent the knowledge of which piece and rotation is required for each configuration in a lookup map. This approach is similar to marching cubes where you need a lookup map to know how to construct the faces of a cube. After writing down 30 entries of the lookup map I came up with the following idea (which I will call "Tile Strings").
Instead of checking all 256 configurations, we only look at a 2x2 section of the neighborhood.
We start with the 2x2 patch in the upper left corner and determine the character it represents.
This character encodes whether the left side of the tile needs a wall, a corner, or can stay empty.
The 2x2 patch is then rotated around the center to determine the characters for the other sides.
After the full rotation we have a 4-character string representing the configuration of the center tile, for example "WCCE".
This is our tile string.
Example code for a single 2x2 check (written in GDScript):
if cell(p + Vector3(-1, 0, 0)):
    if cell(p + Vector3(0, 0, -1)) and not cell(p + Vector3(-1, 0, -1)):
        tile_str += "C"
    else:
        tile_str += "E"
else:
    tile_str += "W"
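Extending that single check to all four sides, the whole tile string can be computed by rotating the three offsets instead of the patch itself. A Python sketch (the grid representation, helper names, and rotation direction are my assumptions, not the original GDScript):

```python
def walkable(grid, x, y):
    """grid maps (x, y) -> 0/1; missing cells count as impassable."""
    return grid.get((x, y), 0) == 1

def rotate(offset):
    """Rotate an offset by 90 degrees around the origin (direction assumed)."""
    x, y = offset
    return (-y, x)

def tile_string(grid, x, y):
    """Build the 4-character tile string for the tile at (x, y)."""
    # Offsets of one 2x2 check: side neighbor, adjacent neighbor, diagonal.
    side, top, diag = (-1, 0), (0, -1), (-1, -1)
    chars = ""
    for _ in range(4):
        if walkable(grid, x + side[0], y + side[1]):
            if walkable(grid, x + top[0], y + top[1]) and not walkable(grid, x + diag[0], y + diag[1]):
                chars += "C"  # corner piece needed on this side
            else:
                chars += "E"  # side stays empty
        else:
            chars += "W"  # wall on this side
        # Move on to the next side by rotating all three offsets.
        side, top, diag = rotate(side), rotate(top), rotate(diag)
    return chars
```

A tile with no walkable neighbors yields "WWWW", a fully surrounded tile "EEEE".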
With the calculated tile string we can start to search for a suitable sprite or model in a library (our library should consist of 15 assets named by their tile string). We still have to manipulate the tile string and determine the rotation, as we most likely only want to create 15 assets.
As the tile string represents a unique configuration we only have to rotate it (at most 3 times) to find a fitting configuration in our library.
We start by checking if the tile string already fits an element in our library. If this is the case we can just use it!
If no element fits, we rotate by 90 degrees by moving the last character of the string to the front: "WCCE" is transformed to "EWCC".
The number of rotations is counted.
We do this at most 3 times; if you haven't found a fitting element after 3 rotations, your library is not yet complete.
Once the rotated element fits, it can be inserted into our scene with the determined rotation.
Example code for finding an element in the library (written in GDScript):
var rotation = 0
for i in range(4):
    if tile_str in library:
        add_part(position, rotation, tile_str)
        break
    rotation += 90
    tile_str = tile_str[3] + tile_str[0] + tile_str[1] + tile_str[2]
Your library does not have to be limited to 15 elements. Maybe corridors running north-south should have a different design than east-west corridors. Just add a new element with the corresponding tile string to your library.
This approach only requires a few lines of code. There is probably a lot of potential for improvement but it works perfectly for my use case. The process of developing this once again showed me that it is worthwhile to take a step back and think about your problem. The first two approaches showed me properties of the problem and implementing them (partially) deepened my knowledge, which ultimately led to the "Tile String" algorithm.
At the moment, around 800 people die of COVID-19 every day. A number I can't picture, because I lack a comparison. What does it mean when ~800 people die per day?
einwohner = 83_000_000
tote_pro_tag = 800
prozent_tote_pro_tag = tote_pro_tag / einwohner
print(f"{prozent_tote_pro_tag * 100:.4f}%\tof the population dies per day")
print(f"{prozent_tote_pro_tag * 100 * 7:.4f}%\tof the population dies per week")
print(f"{prozent_tote_pro_tag * 100 * 30:.4f}%\tof the population dies per month")
print(f"{prozent_tote_pro_tag * 100 * 365:.4f}%\tof the population dies per year")
0.0010% of the population dies per day
0.0067% of the population dies per week
0.0289% of the population dies per month
0.3518% of the population dies per year
Expressing this in percentages doesn't make it much better. So what does it look like if we transfer the COVID-19 mortality to other situations?
fluggaeste_deutschland = 227_000_000 # Source: https://de.statista.com/themen/1060/flugpassagiere/
tote_fluggaeste_pro_tag = prozent_tote_pro_tag * fluggaeste_deutschland
print(f"{int(tote_fluggaeste_pro_tag)} people would die flying every day")
print(f"{int(tote_fluggaeste_pro_tag * 7)} people would die flying every week")
print(f"{int(tote_fluggaeste_pro_tag * 30)} people would die flying every month")
2187 people would die flying every day
15315 people would die flying every week
65638 people would die flying every month
How many planes would have to crash for that?
passagiere_pro_maschine = 101 # Source: https://de.statista.com/statistik/daten/studie/327019/umfrage/passagiere-je-flugzeug-deutsche-flughaefen/
fluege = fluggaeste_deutschland / passagiere_pro_maschine / 365
abstuerze = tote_fluggaeste_pro_tag / passagiere_pro_maschine
print(f"{abstuerze:.0f} of {fluege:.0f} planes would crash every day")
22 of 6158 planes would crash every day
zuggaeste_deutschland = 2_600_000_000 # https://de.statista.com/statistik/daten/studie/13626/umfrage/reisende-im-schienenpersonenverkehr-der-db-ag/
tote_zuggaeste_pro_tag = prozent_tote_pro_tag * zuggaeste_deutschland
print(f"{int(tote_zuggaeste_pro_tag)} people would die on train journeys every day")
print(f"{int(tote_zuggaeste_pro_tag * 7)} people would die on train journeys every week")
print(f"{int(tote_zuggaeste_pro_tag * 30)} people would die on train journeys every month")
25060 people would die on train journeys every day
175421 people would die on train journeys every week
751807 people would die on train journeys every month
How many train accidents would that be per day?
kapazitaet = 468 # Intercity 2 https://de.wikipedia.org/wiki/Intercity_2_(Deutsche_Bahn)
zuege = zuggaeste_deutschland / kapazitaet / 365
ungluecke = tote_zuggaeste_pro_tag / kapazitaet
print(f"{ungluecke:.0f} of {zuege:.0f} trains would crash every day")
54 of 15221 trains would crash every day
It is horrifying how many people are currently dying of COVID-19. Only when you put these numbers in relation does the scale really sink in. None of us would board a train or a plane if it were as dangerous as this pandemic currently is.