I got a complaint from a customer on one of my recent cPanel server installations that his DNS zones were not being replicated to our main DNS servers, which was causing him problems.
On one of my recently installed servers I got a customer complaint about ffmpeg missing some obscure shared libraries. I went to investigate and ran into an interesting case.
Read more »
I had a task to automate customer file backups from a Windows 10 desktop to a NextCloud server. The main idea was to keep an offline backup that would be updated automatically a few times a week (not every day). The main requirements were to reduce human interaction with the process as much as possible and to exclude some files from the backup altogether.
Initially I was not concerned about this – after all, there is an official Nextcloud Windows desktop client with a command line tool, and the ownCloud desktop client is compatible and also includes a command line tool, so I thought that out of these two I should be able to put together some kind of automation.
Well, to my dismay, both official clients proved to be completely useless for unattended automation. Whatever WebDAV protocol limitations there were, both command line clients had them; for example, the simple task of synchronizing a single directory, say d:\testsync to remote/testsync, was impossible because neither client could create a first-level folder on the remote server. It could be an underlying limitation or a bug in the Qt implementation of the WebDAV protocol, but figuring that out was out of scope for my task.
I needed a reliable and compact WebDAV client for Windows that could be automated, and I needed it yesterday. Compactness was not a hard requirement, but it is always welcome, considering that the official clients drag along about 100 MB of Qt core libraries and were useless for my purposes anyway.
Then, I discovered that WinSCP supports WebDAV and allows scripting advanced enough to help me with my task.
After some testing I came up with the following system:
@echo off
rem
rem winscp starter script
rem restarting winscp until errorlevel is 0
rem
:again
winscp.com /ini=nul /script=syncscript.txt /log=syncresult.txt
if %ERRORLEVEL% neq 0 goto again
echo Success
exit /b 0
open https://<username>:<password>@nextcloud.serveraddress.here/remote.php/webdav/
synchronize remote -filemask="| *.mdf;*.ldf;*.bak;*.tmp;*RECYCLE.BIN/*; S-1*; ~$*.*; desktop.ini" c:\foldertosync\
exit
The filemask option covers the exclusions (everything listed after the | character is excluded from synchronization).
This solution synchronizes about 200 GB of data from a single Windows machine every 2-3 days. It is pretty fast and compact, and the best part is that it transfers only changed files. Case closed.
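The batch file still has to be started somehow; since the goal was an unattended run a few times a week, something like the Windows Task Scheduler is the natural fit. A minimal sketch (the task name, script path and schedule are illustrative assumptions, not taken from the actual setup):

rem create a task that runs the WinSCP starter script on Mon/Wed/Fri at 02:00
schtasks /Create /TN "NextcloudSync" /TR "C:\sync\winscp_start.bat" /SC WEEKLY /D MON,WED,FRI /ST 02:00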
Why do I think that DevOps is not going to last long in its current, widely marketed and hyped form? Because there was really not that much demand for it in the first place – it was hyped and brought up by project management and some marketing people who always know better how programming and software development should be done.
I have seen some IT hypes rise and fade away.
As experience shows, in 2 to 5 years the hype dies out and most things return to the way they were before, maybe with a slight new twist. Of course, some market share always stays with the over-hyped technologies simply because big investments were made in them, and those could not be unmade without some unpleasant discussions, so in the end it was just decided to call the innovation a success.
What I think is that the whole DevOps deal was an attempt by some managers to cut costs™, namely to make software developers do sysadmin tasks without hiring professional sysadmins and basically without a salary raise. A round of applause ensued, big bonuses were distributed and reports of innovation were issued. The nice and shiny face of capitalism. New tools and technologies came out shortly after, and putting DevOps in your title and somewhere on your resume will ensure recruiters' interest in you.
Why do I think this is totally wrong? From my more than 20 years of experience I can say that software developers and system administrators have not only different skill sets but also very different mindsets and ways of solving problems. You know what they say – “if you are a hammer, most problems start looking like nails”. Software developers/programmers are set on solving a problem by writing new code or modifying existing code. All side problems like configuring the development environment, setting up networking, backups and information security are seen as obstacles to the final and ultimate goal – software development. So you would naturally expect the “Ops” tasks to be done quickly, inaccurately and, well, mediocrely, just to get back to the “Dev” part (the finding that 19% of Docker images ship with an empty root password is a nice illustration of the issue). You want an example? Here you go – saving AWS credentials in a GitHub repository is a widely known security problem nowadays, and guess what – in most cases that was done by some CI/CD tool or some high and mighty DevOps engineer who was a harmless and respected programmer in his previous life.
I wouldn’t say that DevOps is pure evil in itself – there are some good ideas in it, but dumping all system administration tasks on developers usually doesn’t lead to any good outcome. Just like the once-popular tape recorders with a bundled radio – both functions were way below average.
I hope that in the end common sense will prevail.
It’s no news – everybody knows it, especially serious web surfers who keep many tabs open: sooner or later, if you do not restart your browser completely, no matter how much RAM your computer has, Google Chrome will consume it all and start crashing – partially, by killing extension processes, or completely, by killing or trashing multiple tab processes. Google developers are constantly working on improvements, trying to make Chrome use less RAM, but so far they have mixed results.
Read more »
As I wrote before, it is more convenient and secure to have something small, fast and limited in features as your default browser (a valid decision for all OSes out there).
Well, time goes by, and the nice, small QtWeb got outdated, with its development stopped about 6 years ago, while new standards (namely SSL/TLS) and new vulnerabilities came out, so I decided
Read more »
With the increasing role of HTTPS websites (Google is pushing everybody to run HTTPS-only websites and considers plain HTTP insecure), the service provided by Let’s Encrypt becomes critically important. But there is a catch – once you get the certificate and redirect your site to HTTPS using .htaccess, you will have a problem renewing the certificate, because the 301 redirect breaks the challenge verification and the command
certbot-auto renew
gives an error about an authorization problem.
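A common workaround (this is my sketch of the general approach, not necessarily the exact fix given in the full post) is to exempt the ACME challenge path from the 301 redirect in .htaccess, for example:

RewriteEngine On
# do not redirect Let's Encrypt challenge requests
RewriteCond %{REQUEST_URI} !^/\.well-known/acme-challenge/
# redirect everything else to HTTPS
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]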
Read more »
If you are using Webmin, an excellent system for managing virtual host configuration, it makes perfect sense to integrate it with Let’s Encrypt, the popular certificate authority that issues completely free SSL certificates.
There are a few initial steps that have to be made inside Webmin in order to make it utilize the Let’s Encrypt SSL certificate issuing process for configured virtual hosts. I have successfully configured and used Webmin version 1.831 and certbot-auto 0.12. YMMV.
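For reference, the same certbot-auto can also issue a certificate directly from the command line via the webroot plugin; a minimal sketch with a made-up domain and document root (this is not the Webmin-integrated flow described in the full post):

# hypothetical example -- domain and webroot path are placeholders
./certbot-auto certonly --webroot -w /home/example/public_html -d example.com -d www.example.com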
Read more »
When you have multiple PHP-FPM pools configured on a server, you usually store the configurations under /etc/php-fpm.d. When you have a lot of sites, this directory starts looking pretty crowded.
Although when you look inside a typical PHP-FPM pool configuration file, you can easily notice that only about 4 lines make a difference – everything else is absolutely identical, as the sketch below illustrates.
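As an illustration (a hypothetical pool file; the names and paths are made up, not taken from the post), the lines that actually differ between pools are roughly the pool name, user, group and listen socket:

; /etc/php-fpm.d/example_com.conf -- only these four lines really change per site
[example_com]
user = example_com
group = example_com
listen = /var/run/php-fpm/example_com.sock

; the rest is typically identical in every pool
pm = ondemand
pm.max_children = 10
pm.process_idle_timeout = 10s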
Read more »
VMware Workstation is a perfect candidate for setting up a local PXE server for testing and development – it contains an independent DHCP server that is roughly equivalent to ISC DHCP v2. All you need is a VM of your own that will serve as the TFTP server for the network boot images; there are plenty of instructions on how to do that on Linux.
I decided to configure the DHCP server of my VMware Workstation for Windows (Windows 7 in this case) as a PXE server on the NAT (vmnat) network adapter, since I already have a Linux VM on the NAT network which I can configure as the TFTP server.
The config file is %SYSTEMDRIVE%\Users\All Users\vmnetdhcp.conf, in order to enable PXE you will need to add 4 lines.
allow booting;
allow bootp;
next-server <Your-TFTP-VM-IP-Here>;
filename "pxelinux.0";
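To show where these lines might end up (a sketch only: the subnet, range and addresses below are illustrative, VMware generates different values for each installation), the NAT section of vmnetdhcp.conf could look roughly like this:

# global options
allow booting;
allow bootp;

# VMnet8 (NAT) section -- addresses are illustrative
subnet 192.168.100.0 netmask 255.255.255.0 {
    range 192.168.100.128 192.168.100.254;
    option routers 192.168.100.2;
    option domain-name-servers 192.168.100.2;
    next-server 192.168.100.10;    # IP of the Linux TFTP VM
    filename "pxelinux.0";
}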