Open vs. authenticated only access
I really liked the idea of providing open comment access to pages here. But today some spammer found the site and I had to disable it.
Comments are still welcome, but one has to authenticate first.
Anonymous comments are moderated; anonymous edits are not allowed. It should be easy to authenticate with OpenID.
Non-Puppet best practice
A friend of mine asked me how I use puppet and what I consider best practice.
He thinks I still use Puppet the way I presented it in a talk 6 years ago (in Hungarian).
(Oh, that was quite a while ago.)
At that time I really thought Puppet was the way to go, but a few years ago I simply did not have the time to keep Puppet up and running: I was reassigned at work to another organizational unit and many things changed at once.
About a year ago I revisited the idea of reviving/redeploying Puppet, but since my thinking has changed recently, I only grew more disgusted as I tried to use it again.
There are usability problems, upgrade incompatibilities, recipes requiring a newer Puppet, etc. I will not go into the details, as Martin summarized it quite well and I agree with him.
Instead of Puppet I now use Slaughter, only for some basic things like:
- deploying configuration files
- installing packages
- adding all admins with their SSH keys to all machines
and I have not yet deployed it to all machines.
Still, I feel confident because of Perl. There is no hidden magic involved and no DSL to learn. It has less power, but it is much easier at the same time.
General ideas that may also work with Puppet...
My idea (which requires some work in Slaughter) is to use an external database/data structure instead of defining the same thing multiple times.
What do I mean? I do not want a fancy way to configure DNS, Nagios, Graphite, PF and so on, defining the same thing over and over again.
Instead, define it once: "here is my public HTTP service running on host myhost.domain, port 80". Then the configuration management should:
- generate a DNS record for the host (if it does not already have one)
- configure the firewall to let port 80 through
- install Apache on the host
- monitor the service
Puppet has a mechanism (module?) called Hiera that lets you do this. I think with Chef this is built in. (Chef is so hard to get started with that I did not dare to try it at all.)
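The idea can be sketched as follows; everything here (the data layout and the generated snippets) is my own hypothetical illustration, not actual Slaughter, Puppet, or Hiera code:

```python
# Hypothetical single-source service definition: all names, the IP and
# the output formats below are made up for illustration.
SERVICES = [
    {"name": "www", "host": "myhost.domain", "ip": "192.0.2.10",
     "port": 80, "package": "apache2"},
]

def dns_records(services):
    # One A record per host, deduplicated across services.
    return sorted({f'{s["host"]}. IN A {s["ip"]}' for s in services})

def firewall_rules(services):
    # PF-style pass rules, one per exposed port.
    return [f'pass in proto tcp to {s["ip"]} port {s["port"]}'
            for s in services]

def packages(services):
    # Packages to install, deduplicated.
    return sorted({s["package"] for s in services})

def monitoring_checks(services):
    # Nagios-like TCP checks for each service.
    return [f'check_tcp -H {s["host"]} -p {s["port"]}' for s in services]
```

From this one definition, each subsystem's configuration falls out mechanically instead of being maintained four times by hand.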
Icinga told me that HTTPS was failing on one of my servers.
I tried to restart Apache, but it did not help. There were some SSH attempts (obviously automated bots) in the logs, but nothing serious.
Oh, wait. Only HTTPS is failing; HTTP is fine. Okay, I just filtered SSH from the outside network and now everything works as expected :).
HTTPS actually worked, but very slowly, because the SSH attempts depleted the available entropy.
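On Linux the kernel reports the entropy pool size in /proc/sys/kernel/random/entropy_avail; a quick sketch for keeping an eye on it (the threshold below is an arbitrary number I picked for illustration, not an official limit):

```python
def entropy_is_low(bits, threshold=1000):
    """Return True when the pool is small enough that blocking reads
    (e.g. TLS handshakes pulling from /dev/random) may stall.
    The default threshold is a made-up example value."""
    return bits < threshold

def read_entropy(path="/proc/sys/kernel/random/entropy_avail"):
    # Linux-specific: the file contains a single integer (bits).
    with open(path) as f:
        return int(f.read())
```

Polling this from a monitoring check would have pointed at the real cause much faster than restarting Apache did.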
This will be a long post about an interesting topic. (packaging related, not sure if it is interesting to everybody)
Getting more involved in Debian
I had this idea of getting more involved in Debian. I use Perl already, so I thought debian-perl could be a good place to start.
I am especially interested in Mojolicious, as it is a great Perl web framework that I like to use to create web UIs for accessing administrative (sysadmin related) tools.
libmojolicious-perl (the Debian package) is more than half a year old in Debian, so I tried to update it to the latest upstream version.
Debian Perl team's repack.sh
The Debian Perl team created a tool for their own use, as this task:
- affects many packages
- has to be done every time a new upstream release becomes available
They have nice documentation about packaging and repacking (removing undesired or undistributable parts):
I can create packages quite easily, but I tried to do it the Debian way, as this can also help the package maintainers. OTOH: using these standard tools is worth it in the long run.
The problem is... (or just go to the technical details)
repack.sh is invoked by uscan from the watch file.
The problem is that the repacking mechanism interacts badly with git-import-orig: it imports the original tarball and not the one with some files removed, because uscan does not announce that a repack happened.
However, it is possible for git-import-orig to extract this information from uscan's output anyway.
% uscan --dehs
<dehs>
  <package>libmojolicious-perl</package>
  <debian-uversion>2.98+dfsg</debian-uversion>
  <debian-mangled-uversion>2.98</debian-mangled-uversion>
  <upstream-version>3.68</upstream-version>
  <upstream-url>http://search.cpan.org/CPAN/authors/id/S/SR/SRI/Mojolicious-3.68.tar.gz</upstream-url>
  <status>Newer version available</status>
  <target>libmojolicious-perl_3.68.orig.tar.gz</target>
  <messages>Successfully downloaded updated package Mojolicious-3.68.tar.gz and symlinked libmojolicious-perl_3.68.orig.tar.gz to it</messages>
  <messages>Executing user specified script: sh debian/repack.stub --upstream-version 3.68 ../libmojolicious-perl_3.68.orig.tar.gz; output: Repackaging ../libmojolicious-perl_3.68.orig.tar.gz
removed `lib/Mojolicious/public/js/jquery.js'
*** ../libmojolicious-perl_3.68+dfsg.orig.tar.gz ready</messages>
</dehs>
Try to patch git-buildpackage
The line which starts with '***' contains the repacked filename, so git-import-orig can learn about it!
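The parsing this needs is simple; here is a sketch of the idea in Python (my own illustration, not the actual git-buildpackage patch):

```python
import re

def repacked_target(uscan_output):
    """Find the repacked tarball name in uscan's repack messages.

    The repack script prints '*** <filename> ready' on success; this
    illustration just scans for that marker and returns the filename,
    or None when no repack happened.
    """
    for line in uscan_output.splitlines():
        m = re.search(r'\*\*\*\s+(\S+)\s+ready', line)
        if m:
            return m.group(1)
    return None
```

Fed the messages shown above, this would yield ../libmojolicious-perl_3.68+dfsg.orig.tar.gz instead of the original tarball name.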
I submitted a patch for git-buildpackage (#635920).
To tell the truth, I know that this is not the best solution, but there was already an open bug report and it seemed convenient.
Okay then, patch uscan to handle repack.sh better
I was told to try fixing/enhancing uscan instead.
So I added a special case for when the invoked script is debian/repack.stub, and asked Gregor Herrmann on #debian-perl.
He noted that he has doubts that adding a special case for repack is the right thing to do.
I told him that uupdate is already handled specially.
OTOH, if repack.stub (this is what finds and executes the real repack.sh) is handled in a special way, then for a robust interface repack should be included in devscripts (uscan is part of devscripts). We (Gregor Herrmann and I) agreed on this.
What about adding native repack functionality to uscan?
This was my next thought: why stop there? Just improve uscan to know about repacking (there is already a repack command line switch in uscan, but currently that only does recompression, nothing more).
Gregor told me that this is something Andreas Tille has been working on in #685787. Oh well... I looked at it and it seemed to be going entirely in the right direction; it just won't happen before wheezy.
Now it is located in this git repo: <git://git.debian.org/git/users/tille/devscripts.git>
Try it for some perl packages
I tried out how this can replace the functionality already used by the debian-perl team. I removed repack.stub from the watch file; no more external commands, as this is internal from now on. I also added the right parameters to the copyright file, namely Files-Excluded:
Format: http://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
Upstream-Name: Mojolicious
Upstream-Contact: Sebastian Riedel <sri@removed-intentionally>
Source: http://search.cpan.org/dist/Mojolicious/
Comment: jquery.js is removed (not the preferred form of modification)
 to create the +dfsg version.
Files-Excluded: lib/Mojolicious/public/js/jquery.js
This works quite well, except that it still returns the original filename in the <target> section. I am about to file bug reports for this. When that is done:
- There will be a universal way of repacking packages in Debian.
- The Perl team can migrate to this new format
- As git-import-orig cares about the <target> output of uscan, it will just work without any change
Relevant reports from bugs.debian.org
- Please merge functionality for repackaging tarballs
- devscripts: Please provide a standard way of repacking upstream tarballs
- devscripts: Enabling uscan to simply remove files from upstream source
I could have also said phishing.
It happened to me yesterday (reconstructing the events):
- Phishing email sent to users.
- Some of them are always stupid enough to reply
- and they send their password, of course
- or submit the Google form (reporting abuse on Google Forms is futile, Google does not care)
- Some time later something or someone (I suspect part of it is human) logs in with the password.
- They use the web interface (which can be AJAX, so I am not sure how/if it can be automated reliably).
- They send millions of spam messages. (They do not hurry, but it is still recognisable, at least from the bounce rate.)
- The sent spam is retained in the outbox.
The good news is that this supplies valuable metrics for setting up my rate-limiting policy daemon.
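For illustration, this is the kind of per-sender sliding-window limit such a policy daemon could enforce; the class name and the numbers are made up, not my actual daemon:

```python
import time
from collections import deque

class SenderRateLimiter:
    """Sliding-window limit on messages per sender.

    The 100-messages-per-hour default is an invented example; a real
    policy would be derived from observed legitimate traffic.
    """
    def __init__(self, max_messages=100, window_seconds=3600):
        self.max_messages = max_messages
        self.window = window_seconds
        self.sent = {}  # sender -> deque of send timestamps

    def allow(self, sender, now=None):
        now = time.time() if now is None else now
        q = self.sent.setdefault(sender, deque())
        # Drop timestamps that fell out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_messages:
            return False  # over the limit: defer/reject the message
        q.append(now)
        return True
```

A compromised account blasting out spam hits the ceiling quickly, while normal users never notice the limit.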
If you are looking for the slides of my "Realtime network graphs" talk from the HBONE 2012 workshop, you are in the right place.
Unfortunately a projector problem completely threw me off, so it was impossible to follow my point. I am trying to set that right here.
c3 noc demo
Its structure is the following:
The backend collects metrics by regularly running simple (and rather clumsy) shell scripts. The data goes into memcache.
What is this node.js and why is it good?
The two main strengths of Node.js are high performance and its server orientation (e.g. TCP, HTTP). Using it, data coming from several sources can easily be published in a truly realtime fashion.
Some interesting or useful modules (a great many modules exist):
- hook.io event emitter (e.g. writing an IRC bot with it is very simple)
- node-reverse-proxy HTTP reverse proxy
- Introduction to Node.js with Ryan Dahl (a talk by the main developer)
I am working on integrating http://smietnik.xon.pl/txtgtd/ into ikiwiki.
I made some progress, but I am not really sure how it should work. Basic functionality like generating a next actions list is already available.
[[!gtd data="""
Some test project
o clean garage @ home
o buy nails @ errands
o nail new frame @ home
"""]]
- buy nails (Some test pr...)
- clean garage (Some test pr...)
- nail new frame (Some test pr...)
What do you think?
It is just crazy how long the lines Shibboleth produces are. I made a way to watch these logs more easily (for me).
I remove the long class names (crazy, I mean it) and add line breaks.
This is it:
tail -f /usr/local/shibboleth-idp-2.2.0-slo9/logs/idp-process.log \
  | sed 's,edu.internet2.middleware.shibboleth,SHIB,g' \
  | sed 's,\] - ,\]\n ,'
I had a version that contained a Perl one-liner, but I converted it to sed.
This will be a brief email (just to get things rolling).
I made some improvements (and will be making more soon).
I am willing to send patches if you are interested. Some are obvious; some may be against the goals you outlined in the README "Overview" section.
I want to use Slaughter at our university. We use Perl everywhere we can; that is my motivation here.
No real modification was necessary, as one can easily embed the password in the URL. Maybe it is worth noting in the docs, in case someone wants to password-protect Slaughter operations.
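For illustration, the URL form in question carries the credentials in the authority part, and standard tools can pick it apart; a Python sketch (the host and credentials are made up, and this is not Slaughter code):

```python
from urllib.parse import urlsplit

# A fetch URL with embedded credentials, as one might point a client
# at a password-protected server (hypothetical host and account).
url = "https://deploy:secret@config.example.org/slaughter/"

parts = urlsplit(url)
username = parts.username   # user part before the colon
password = parts.password   # password part after the colon
host = parts.hostname       # host without the credentials
```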
I added certificate checks when using HTTPS; newer (wheezy) libwww-perl supports this. Some code is also in place to check whether libwww-perl is recent enough to use this feature.
I have some small fixes, like replacing the UA string, plus something related to the Debian build process.
I am also exploring options for making the "library" part of Slaughter distributable/updatable by the server. (I have not figured this out yet.) I also want to make some methods available for FreeBSD and OpenBSD.
Tell me if you are interested!
cstamas at ppke.hu
I contacted him via the form on his website. I got a message back fast.
So far so good! Sending patches now...
URL to slaughter
This is what I am talking about: http://www.steve.org.uk/Software/slaughter/