Alleycat News


On this day of September 29, 2022

So I have a new version of the 8kun Bread Launcher ready to go, plus a new app called Meme Finder. Both of these use the new "shadow port" feature of the Kraker Local Proxy Server. The last version (2e) of the Bread Launcher works fine from a local file (with stated limitations) and that won't change for the next version. The Meme Finder will absolutely require the proxy server, although you won't be required to proxy all of your browser activity through it (but you should because it's great for DNS). The Finder comes with a really nifty image gallery viewer and will handle any number of folders, as long as you don't have too many memes in any one folder (you should be fine with a few thousand but 10,000 is crazy).

I discovered that, contrary to my expectations, the last version of the Bread Launcher does not work on Chrome-based browsers (I use Brave for testing). It works just fine on Firefox, which is what I use. Mozilla isn't a bitch trying to piss all over developers like the Chrome guys. I don't know how much influence Google has on Chromium development but I'm certainly not willing to give them the benefit of the doubt. It smells like Google has something up their sleeve with this obsession with third-party cookies.

But I'm getting ahead of myself. The problem with third-party cookies is that a lot of websites use them indiscriminately to track people around the Internet. I don't see what the big deal is: browsers are already blocking trackers of all kinds, so why don't they just block the suspicious cookies too? Instead of doing that, Chrome is going for the big guns and killing third-party cookies altogether. They've been talking about this for a while but I hadn't paid any attention because I didn't really care. Now I care because it's causing me grief with the Bread Launcher and the cookies that 8kun uses for the captcha. Firefox is not fussy about third-party cookies so it works fine, but Brave is being a bastard. I wouldn't mind so much if I could use a site permission to get around the issue but, nah, no soup for you. Why not?

The whole motivation for developing the proxy server for Alleycat Player was the inability to flag local files as trusted. I wouldn't even mind if the permission had to be granted for individual files, because Alleycat is just one damn file anyway. There should be a checkbox somewhere that says "let this app do whatever it damn well pleases". Drop all restrictions and let the app fly. After all, if I built a desktop app, there wouldn't be any such restrictions. Why can't I have free rein in my browser? Why do the browser makers have to be so bitchy about it?

I'll tell you why. They think you are too stupid to be trusted. They think you'll get scammed by some Nigerian prince with a money problem and that he'll convince you to install something that fucks up your system. What else could it be? I can't have a checkbox for my apps because they think I'm stupid. Quite frankly, I don't care all that much if other people use my apps. I just want to build them how I want and not have to bend over backwards because of browser restrictions. This is the sort of shit that leads to workarounds like extensions (which they want to block too) and my Kraker Local Proxy Server, which wouldn't be needed at all if I could have that checkbox.

Enough ranting. I love my proxy server and I don't want to be without it. Whatever the cunts at Mozilla and Google decide to do, I'll find a workaround. Anyway, the issue with third-party cookies being blocked on Brave means that full functionality isn't possible without faking an 8kun.top domain using a shadow port. I call it get.8kun.top; Brave thinks it's a real 8kun domain and so it doesn't block the cookies. It's pretty trivial to set up the shadow ports (one is also needed for sys.8kun.top). I had to make a fix in the proxy server for the cookie problem, and now there is a new fix needed for proper Youtube playback because Brave doesn't allow access to "localhost" under the 8kun domain (another thing that Firefox is lax about).
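To illustrate the shadow-port idea (a minimal sketch in Node.js, not Kraker's actual code; a real forward proxy also has to handle CONNECT tunnels and absolute-form URLs): since the browser sends every request through the proxy, the proxy can answer for any host name it likes and serve local content under that name.

    const http = require('http');

    http.createServer((req, res) => {
      const host = (req.headers.host || '').split(':')[0];
      if (host === 'get.8kun.top') {
        // serve the local app under the fake subdomain so the browser
        // files its cookies as first-party 8kun.top cookies
        res.writeHead(200, { 'content-type': 'text/html' });
        res.end('<!-- local app page goes here -->');
      } else {
        // ...anything else would be forwarded to the real destination
        res.writeHead(502);
        res.end();
      }
    }).listen(8080);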

In order for all of this to work, the browser needs to proxy all content through the proxy server. It's the only way. The bonus is that you get a great DNS manager and other features as part of the deal so it's not like you'll be tied down to the proxy just for the sake of using one or two of my apps. Besides that, you'll be planning for the future. Whatever happens to the Internet, I'll be here countering the totalitarian tendencies of our overseers.


On this day of August 1, 2022

I just completed the final touches on the documentation for the Kraker Local Proxy Server. Huge sigh of relief because I won't have to work on the docs anymore. I don't much enjoy writing docs; I'm not a writer and I have trouble finding the right words sometimes (and I'm never sure that I've succeeded). Anyway, I ended with a little rant about Cloudflare and their goddamn Bot Fight Mode. Here it is:

I hate (HATE!) the Cloudflare bot protection. In the case of an HTTP connection, it is no protection at all, really. I discovered that Cloudflare looks at certain header names for proper case usage. The affected headers are: Host, User-Agent, Accept, Accept-Encoding, Accept-Language and Connection. I wrote some code to correct these headers and put them at the beginning of the header stack.

Header names are converted to lower case by Node.js, and this makes sense since the HTTP specifications state that case in header names is not significant. It also makes my code easier to write since case does not need to be considered. A server is not supposed to reject an HTTP transaction based on the case of header names, but we're not talking about normality anymore. It is war out there and Cloudflare is determined to win, no matter what sort of inconvenience that incurs. Ignore the specs? Sure, why not? I shouldn't have to do this.
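For illustration, here's roughly what such a fix can look like (a minimal sketch, assuming you start from the lower-cased header object that Node.js hands you; the function name is mine):

    // the headers Cloudflare checks for proper case, in the order we
    // want them to appear at the front of the header stack
    const CASED = ['Host', 'User-Agent', 'Accept',
                   'Accept-Encoding', 'Accept-Language', 'Connection'];

    function fixHeaders(headers) {
      const out = {};
      for (const name of CASED) {        // re-case these and put them first
        const value = headers[name.toLowerCase()];
        if (value !== undefined) out[name] = value;
      }
      for (const key of Object.keys(headers)) {   // then copy the rest as-is
        if (!CASED.some((n) => n.toLowerCase() === key)) out[key] = headers[key];
      }
      return out;   // pass this to http.request, which preserves the case
    }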

My main target is https://banned.video. This is an Infowars site and it gave me a problem a while back until I discovered an alternative domain (Infowars has a lot of them). Without fixing the headers, the HTTP version of the site simply returns a status 403 with no data. With the header fix, the site will redirect to the HTTPS version. That's progress, since I can now access another, unrelated site that doesn't mind doing plain HTTP. However, I cannot get into the HTTPS version of that other site. I get stopped dead with not even a bot challenge to solve.

The problem with HTTPS is the negotiation (or TLS handshake) needed to establish an encrypted connection. The negotiation protocol is open-ended, meaning that there are a million ways to do it. This means that the specifics of the negotiation can be used to "fingerprint" the incoming connection. In much the same way, browsers can be fingerprinted based on the details of HTML rendering. So Cloudflare takes a fingerprint of the TLS handshake and rejects the connection if the handshake doesn't look like it is coming from a web browser. This is a big deal because it locks out Node.js and, really, any tool which is not based on browser code. It locks out non-transparent proxy servers, so banned.video cannot be accessed through a corporate proxy, and I don't know why Infowars doesn't care about that customer base. For that matter, I don't understand what the problem is in the first place. It's been like this for two years but I've never heard of a bot attack on Infowars. I'm sure they get attacked from time to time but sheesh.
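For what it's worth, Node.js does expose one knob: you can reorder the cipher list, which changes part of a JA3-style fingerprint. It isn't enough to pass as a browser, though, because the rest of the ClientHello (the extensions and their order) is baked in. A sketch:

    const tls = require('tls');

    // reordering the cipher suites alters one field of the fingerprint,
    // but the extension layout still says "Node.js" loud and clear
    const socket = tls.connect({
      host: 'banned.video',
      port: 443,
      servername: 'banned.video',
      ciphers: 'TLS_AES_128_GCM_SHA256:ECDHE-RSA-AES128-GCM-SHA256'
    }, () => {
      console.log('connected:', socket.getProtocol());
    });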

There are actually very few sites that do this. Most of the ones I have seen are pirate video sites. That seems to be a pattern, since Banned Video is also a video site. It suggests that web scraping might be the issue rather than any kind of attack. Whatever. I can't figure out how to get into the site. There is no bot challenge, just a hard stop. There is currently no tool to modify the TLS handshake in Node.js, and this seems to apply across the board, though it apparently can be done with Golang (also known as Go). In any case, I'm not inclined to bother with trying to solve the Cloudflare bot challenge since I'm not even getting that far with the sites that I want to hack. This is The End for now.

The HTTP Toolkit guy has more info here. (Note: his advice on bypassing the fuckery is outdated. Unfortunately.)


News item for June 13, 2022

Kraker v4b is now available. This is a code fix (see my June 11 rant). It is a truly powerful proxy server that goes way beyond the needs of Alleycat Player (which still does not require a Kraker version later than the first one). The Kraker proxy now stands on its own merit and not just as an addon for the player. It is likely the most powerful proxy server designed for personal use. I just need to figure out how to attract some attention to it. As it is, very few people know or care. Well, that seems to be true for all of my software projects since I started working on them.

Funny thing happened. There are two projects in my archive.org repository that suddenly got a lot of attention without any action on my part. The Black Box project (old BBC TV program) has 5 stars and almost a thousand page views. I have no bleeping idea where this interest came from. My Europa project has 15 stars and is nearing 5000 page views. The interest started around February and it might be because of a comment on a Youtube video (no idea which one). What does this mean? Should I get an account on Youtube and start posting comments? Like, where should I post to get traffic to this site? Meh. I don't relish the idea of running around promoting this. Fame isn't my goal and I'm not sure that I really want to manage a horde of curious people who don't have the mental capacity to understand what I'm offering. Not that I think people are really stupid. It's just that most of them make no effort to understand something before trashing, commending or criticizing. I'm not a cat herder.


On this day of June 11, 2022

Time for a rant. This is about Node.js which is the development tool that I use to implement the Kraker Local Proxy Server. I have one favourable comment about this tool: it works flawlessly once you figure out what to do and what not to do. So far, no mysterious crashes or hangups that can't be traced to something that I did "wrong". On the other hand, the documentation is FUBAR (fucked up beyond all repair). Half of the specs don't actually apply, perhaps because the tool has gone through too many revisions. Event handling is the biggest headache. Also, it is difficult to understand where memory leaks come from because important stuff like proper socket destruction is barely touched on. You are given the impression that everything just works so you don't need to worry your little head with picayune details like "did I close that socket properly or not?". The biggest snafu is the HTTP Request module. What a mess. Important revisions took place with Node.js v13 and v16 that aren't documented. You're just supposed to know about them by following the developer forum, I guess.

These revisions are anything but minor. They are code-breaking, and figuring out how to get around them can involve a lot of experimentation to find the differences so that the code can be reworked to compensate. With v13, the timing of some events was changed. When an HTTP transaction is passed down from the HTTP server, there are two forks: the request fork and the response fork. The first one receives the headers and data FROM the caller while the second sends the headers and data back TO the caller. It's a good division of labour and I'm not complaining about that. The difficulty lies in knowing when one or the other has completed processing. This is what events are for: they signal when something important has occurred. Now, don't get me wrong. The changes in v13 are justified and I'm surprised that it took so many versions between 10.0.0 and 13.0.0 to figure this shit out.

The important difference for me is that, in v10, the response fork emits a "close" event only when the response is aborted (usually because the destination server closed the socket). In v13, the event is always emitted when the response completes. Big difference if your code relies on the event to distinguish an aborted transaction from a completed one. Another change occurred in v16: an aborted HTTP transaction no longer closes the underlying sockets. This is a biggie and I can't understand why they made such a change so late in the game. This behavioural change should have been made optional because it is abso-fucking-lutely code-breaking. I wasted several hours trying to figure out an elegant solution that wouldn't break compatibility with earlier versions. My code is designed to be compatible with all Node.js versions from v10 up to the latest, so I need to carefully craft it to handle any discrepancies between versions. I refuse to be put in the situation of having to inform my users that, hey, you should upgrade your copy of Node.js because I'm too lazy to fix my code. I just don't operate that way unless I can find no other alternative.
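To make that concrete, here is the shape of a version-tolerant "close" handler (a sketch with my own naming, not the actual Kraker code): tell an abort apart from a normal completion, and do the socket teardown yourself for v16 and later.

    const http = require('http');
    const server = http.createServer();

    server.on('request', (req, res) => {
      res.on('close', () => {
        // v10: 'close' fires only on abort; v13+: also on normal completion.
        // res.writableEnded appeared in v12.9; older versions have the
        // (since-deprecated) res.finished flag instead.
        const completed = res.writableEnded !== undefined
          ? res.writableEnded : res.finished;
        if (!completed) {
          // aborted: from v16, Node no longer destroys the socket for us,
          // so tear it down explicitly or the connection lingers
          if (res.socket && !res.socket.destroyed) res.socket.destroy();
        }
      });
      res.end('ok'); // ...actual proxying would happen here
    });

    server.listen(8080);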

End of rant. I'm quite pleased with the final outcome. I have finally rid the code of all traces of memory leakage (crosses fingers). My goal of maintaining a memory footprint of around 15MB seems to have been achieved. Now, this is actually pretty important. I have read many complaints about excessive memory usage and I have no doubt that this is due to a lack of understanding of how Node.js handles events and socket closures. The horror stories of Node.js apps consuming hundreds of megabytes can be traced to these memory leaks. It's the same problem that I've been struggling with over the last few versions of my proxy server app. When I first built the proxy, it was only for the sake of Alleycat Player, so actual usage was low and the opportunity for memory leaks was low too. With the inclusion of the Socks5 proxy, DNS support and other features, I now have ALL activity from my web browser streaming through the proxy server. Memory leakage is therefore a more important issue than it used to be.

Granted, the browser consumes 20 to 50 times the memory, but I'm a stickler for perfection. Anyway, if you're a budding amateur developer who has dabbled with Node.js (or is planning to), you should think seriously about carefully studying my code. It might help you to understand what you're doing wrong before you do it. Good luck and please don't hesitate to drop me an email with any questions you may have. After all, what is the point of doing what I'm doing if it is not to help someone (hopefully a lot of someones)?


News item for June 9, 2022

Oh my goodness! The DMCA notice references the blobs in my repository rather than the canonical paths. What this means is that modifying or deleting the files makes no difference. I would need to install a tool to prune the Git history and remove the blobs. If you don't understand what I'm talking about then join the club. I ended up deleting the entire repository and rebuilding it because that was easier than figuring out how to delete the history or the blobs or whatever it is I was supposed to do. I hope this doesn't happen again.


News item for June 7, 2022

Wow! Got hit with a DMCA notice today. This affects versions of Alleycat Player from v4b to v5b. Corus Entertainment doesn't like having their free Global News feeds passed around the Internet. A great many IPTV playlists are affected all over GitHub. For the time being, I have removed all of the affected copies of Alleycat except for v5b, from which I have removed the Global News link. I'm thinking of using a simple algorithm to obfuscate the link so that Corus won't be able to find it again. They're obviously just using a simple string search to find the links. Obfuscation won't work for all of those playlists though. The DMCA notice lists a few hundred. Geez, what a needless headache for the GitHub management.
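Any reversible encoding would do, since the goal is only to defeat a literal string search. A trivial sketch (example.com stands in for the real link, which I'm obviously not going to spell out here):

    // neither line contains the URL as a searchable literal
    const link1 = atob('aHR0cHM6Ly9leGFtcGxlLmNvbS9nbG9iYWw=');  // decoded at runtime
    const link2 = 'https:/' + '/example' + '.com/global';        // split-string version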

The DMCA notice is here: Plain Text or Rich Text Format


News item for May 15, 2022

Oh man! So long since I've typed something here. I just finished a massive update on the Kraker Local Proxy Server instruction manual. There is a LOT to document with the new update to version 4a (and I still have a few more things to cover but this is enough for now). Kraker isn't just a side tool for Alleycat Player anymore. It's now in the big leagues and deserving of some kind of an award. If I do say so myself, kek.

Anyway, I'm tired right now but I'll be back soon to post my thoughts. Stay tuned, fren.


News item for November 17, 2021

Alleycat Player will be two years old on December 7. Hard to believe that it's been that long. As noted on the main page, the installation and user manuals have been updated. This was way overdue and I had to really drag myself to the task after leaving it aside for over a year. Now it's done and I can kick back with a quart of tequila and celebrate all the hard work I've put into this. Hope somebody appreciates it. Later.


News item for October 13, 2021

I had to upload a new update for the 8kun Bread Launcher because I found some issues with Chrome-based browsers. The problem is that Chrome is a dog at opening new tabs and preparing the DOM: it reports that it is ready when it is not. This forces me to add extra checks and timeouts to make sure that the DOM is actually ready to be modified. If I don't, I get the wrong values for things like the width or height of an element. This causes two problems when opening a new tab: the title in the status bar may not appear, and images may be placed on the wrong side (the right instead of the left).
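The workaround amounts to polling until the new tab reports real layout numbers. Something like this (a sketch; the names are mine, not the actual Launcher code):

    // wait until the opened tab has actually laid out the element
    // before reading its width/height or modifying the DOM
    function whenLaidOut(win, el, fn, tries = 20) {
      if (win.document.readyState === 'complete' && el.offsetWidth > 0) {
        fn();                               // layout numbers are real now
      } else if (tries > 0) {
        win.setTimeout(() => whenLaidOut(win, el, fn, tries - 1), 50);
      }
    }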

Another thing is that Chrome is stubborn about timers running in a page. It wants to slow them down when the page is in the background. That's fine; I understand how some web developers will abuse timers. However, a timer should only be slowed down if it consumes a lot of CPU. The auto-update timer in the Launcher does not consume CPU until it hits zero, so there is no good reason to slow it down, but Chrome is dumb about it anyway. The solution is to move the timer out of the main page and into the daughter page (because the main page is the one running in the background). It's just one line of code. Chrome will only slow the timer down if the daughter page goes to the background. Which is still wrong, but at least it's tolerable.
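The one line in question looks something like this (daughterWin is my stand-in name for the window handle of the opened tab):

    // create the countdown on the daughter window so the timer belongs
    // to the foreground page and escapes the background throttling
    const timerId = daughterWin.setInterval(updateCountdown, 1000);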


News item for October 9, 2021

I am pleased to announce that the 8kun Bread Launcher finally supports posting. This is just a preliminary update. The posting feature is not yet full-featured but that'll come in the next week or so. Barring head-scratching complications, of course, and I've had a lot of them with this update. HTML and CSS never cease to present me with browser compatibility problems. For example, it turns out that Chrome browsers put a little blank space below the <textarea> element that should not be there. I had to hit the Internet to find a solution, which was to add "vertical-align:top" to the CSS for the element. Really??? Why do these things happen? Oh well. At least we are past the bad old days of Internet Explorer and Netscape.

Something is totally buggered with the "XMLHttpRequest" component, which is what I use to implement the posting operation. I could have used "fetch", which is what I would normally do, but I wanted to implement an upload progress indicator and that is not possible with "fetch". I got the indicator wired up but it doesn't actually work. I know it doesn't work in Firefox and I haven't bothered to test the other browsers, but my research on the net seems to indicate that this is a problem all around. The trouble is with the "onprogress" callback, which doesn't fire at regular intervals and often reports the wrong amount of progress. I mean, it'll jump to 62% done right off the bat and then not fire again. Shit like that. Oh well, at least the uploading works.
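For reference, the hookup looks like this (a sketch; postUrl, meter and formData are placeholder names). Note that upload progress comes from xhr.upload, not from xhr itself, and even then the events fire as erratically as described above:

    const xhr = new XMLHttpRequest();
    xhr.open('POST', postUrl);            // postUrl: hypothetical endpoint
    xhr.upload.onprogress = (ev) => {     // fires erratically, as noted
      if (ev.lengthComputable) {
        meter.value = Math.round((100 * ev.loaded) / ev.total);
      }
    };
    xhr.send(formData);                   // formData: the post's payload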


On this day of September 17, 2021

I encountered a peculiar artifact while I was revamping the HTML for the Alleycat main page. My goal was to keep the text in a panel separate from the image in the same panel such that, if you zoom in (or if the user forces a larger font), the text won't wrap around the image as the panel expands vertically. Also, the image should remain centered. The only practical way to achieve this is to use a table format. An odd thing happened due to the inclusion of a table inside the panel: a 5-pixel-tall space appeared below the bottom border of each panel.

It was not my CSS that caused this spatial displacement. There was absolutely no reason for it, and the artifact appeared in different browsers. In Firefox, the spacing varied by a pixel (arrgh). Originally, the panels were constructed with a <div> tag, so I simply built the table inside the div. Removing the div and transferring the CSS to the <table> tag cleared up the problem. I have no idea what went wrong but I guess I should be happy that it is now fixed.

I have a similar issue with Alleycat Player, and it may be due to the way the table format conflicts with something else in the video viewers. All three viewers should be displayed exactly the same, but they are not: one of the three has a slight difference somewhere, and it varies with different browsers. It's been months and I still can't figure it out. Grrrrr.

Addendum: After writing the above, I discovered something else. The problem was caused by declaring "display:inline-block" in the CSS for the div. The artifact goes away with "display:block". Alternatively, the declaration "vertical-align:middle" also works. Why??? My best guess is that an inline-block box sits on the text baseline by default, so the browser reserves space below it for letter descenders; that would explain both the phantom gap and why changing the vertical alignment clears it.


News item for September 15, 2021

The 8kun Bread Launcher has been updated in response to a recent change to the media server on 8kun. The media server went down at 3PM on August 26 and was restored on September 13 after a downtime of 18 days and 4.5 hours. The new domain for the media server is "images.128ducks.com". This condition is only temporary. Who knows how temporary?