
About RETAC21

  • Birthday 04/27/1971

Profile Information

  • Gender
  • Location
    Madrid, Spain
  • Interests
    Military history in general

RETAC21's Posts
  1. You must have been asleep last year; the Russians sent hunter-killer teams in the south that ravaged Ukrainian columns in the first few days of the war. They repeatedly said this is how their helicopters fly, but when the frontline stabilised, the risk from MANPADS was too high and they switched to more conservative tactics.
  2. "in two or three months when the water has drained and the mud dried up" By then, the summer is mostly past and this flank will have been secured. The Russians expect a push towards Crimea on the Melitopol/Tokmak axis or towards Mariupol, so just being sure that they aren't getting flanked in the short term is worth it.
  3. https://twitter.com/UAWeapons/status/1666096408414543875?s=20
  4. The US lost it and then rebuilt it:
     https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/July-August-2019/Spring-Glace-Electronic-Warfare/
     https://www.c4isrnet.com/electronic-warfare/2022/10/09/jam-spoof-and-spy-us-army-looks-to-energize-electronic-warfare/
     But what the West in general lacks is the synergistic approach of the Russians, who hit everything at the same time - but who forgot to deconflict with their own signals.
  5. Well, this is the golden source mentioned pages ago. Note that it is still incomplete (there are units and formations that still crop up), but it's about as good as you can get. Too bad the project died and the following volumes never happened. Some terminology:
     • Reactive artillery = multiple rocket launchers
     • Rocket artillery = SSMs
     • Anti-tank and comms "divisions" = battalions
     Tank training regiments would become independent tank regiments if war came, presumably manned by mobilised personnel and instructors, and if you check the Forst Zinna video you posted, you have one in action, so you can guess how effective they would be...
  6. Not necessarily. Some of the independent divisions were kept at a high readiness state as immediate-action units for contingencies within the MD (i.e. "let's invade Romania"), while others were low-readiness divisions intended for local defence (see the Baltic MD, where each Baltic nation gets a category III division).
  7. I think this has been quoted before: https://thedeaddistrict.blogspot.com/2021/07/hungarian-t-55-live-fire-tests.html
  8. I find it odd that no one seems to have noticed these defects in peacetime, but your last example, for instance, should be filtered out by software.
  9. Is Patriot failing? Food for thought: https://twitter.com/pati_marins64/status/1664746614383075328?s=20
  10. Neither "side" is fighting the war with all means. One year plus after the start of the war, the "West" is still sending leftovers from the Cold War, and no new production is being contracted for anything but ammunition. On the Russian side it's the same: you see UVZ delivering a battalion of tanks per month - just enough to keep up with the monthly attrition, but not to make up for the losses at the start of the war.
  11. “an ordinary girl with an extraordinary spirit” https://indianexpress.com/photos/lifestyle-gallery/the-real-neerja-bhanot-rare-photos-and-her-story/18/
  12. Bad Russian, bad Russian. Here's the article; note the date:

     Terminator war scenario no longer a joke
     Tom Malinowski
     11:21, Nov 26 2012

     The use of drones to kill suspected terrorists is controversial, but so long as a human being decides whether to fire the missile, it is not a radical shift in how humanity wages war. Since the first archer fired the first arrow, warriors have been inventing ways to strike their enemies while removing themselves from harm's way.

     Soon, however, military robots will be able to pick out human targets on the battlefield and decide on their own whether to go for the kill. A US Air Force report predicted two years ago that "by 2030 machine capabilities will have increased to the point that humans will have become the weakest component in a wide array of systems". A 2011 US Defence Department road map for ground-based weapons states: "There is an ongoing push to increase autonomy, with a current goal of 'supervised autonomy', but with an ultimate goal of full autonomy."

     The Pentagon still requires autonomous weapons to have a "man in the loop" - the robot or drone can train its sights on a target, but a human operator must decide whether to fire. But full autonomy with no human controller would have clear advantages. A computer can process information and engage a weapon infinitely faster than a human soldier. As other nations develop this capacity, the US will feel compelled to stay ahead. A robotic arms race seems inevitable unless nations collectively decide to avoid one.

     I have heard few discussions of robotic warfare without someone joking about The Matrix or Terminator; the danger of delegating warfare to machines has been a central theme of modern science fiction. Now science is catching up to fiction. And one doesn't have to believe the movie version of autonomous robots becoming sentient to be troubled by the prospect of their deployment on the battlefield.

     After all, the decisions ethical soldiers must make are extraordinarily complex and human. Could a machine soldier distinguish as well as a human can between combatants and civilians, especially in societies where combatants don't wear uniforms and civilians are often armed? Would we trust machines to determine the value of a human life, as soldiers must do when deciding whether firing on a lawful target is worth the loss of civilians nearby? Could a machine recognise surrender? Could it show mercy, sparing life even when the law might allow killing? And if a machine breaks the law, who will be held accountable - the programmer or manufacturer? No one at all?

     Some argue that these concerns can be addressed if we program war-fighting robots to apply the Geneva Conventions. Machines would prove more ethical than humans on the battlefield, this thinking goes, never acting out of panic or anger or a desire for self-preservation. But most experts believe it is unlikely that advances in artificial intelligence could ever give robots an artificial conscience, and even if that were possible, machines that can kill autonomously would almost certainly be ready before the breakthroughs needed to "humanise" them. And unscrupulous governments could opt to turn the ethical switch off.

     Of course, human soldiers can also be "programmed" to commit unspeakable crimes. But because most human beings also have inherent limits - rooted in morality, empathy, capacity for revulsion, loyalty to community or fear of punishment - tyrants cannot always count on human armies to do their bidding. Think of the leaders who did not seize, or stay in, power because their troops would not fire on their people: the communist coup plotters who tried to resurrect the Soviet Union in 1991, the late Slobodan Milosevic of Serbia, Hosni Mubarak of Egypt, Zine el-Abidine Ben Ali of Tunisia. Even Syria's Bashar al-Assad must consider that his troops have a breaking point.

     But imagine an Assad who commands autonomous drones programmed to track and kill protest leaders, or to fire automatically on any group of more than five people congregating below. He would have a weapon no dictator in history has had: an army that will never refuse an order, no matter how immoral.

     Nations have succeeded before in banning classes of weapons - chemical and biological weapons, cluster munitions, landmines, blinding lasers. It should be possible to forge a treaty banning offensive weapons capable of killing without human intervention, especially if the US, which is likely to develop them first, takes the initiative. A choice must be made before the technology proliferates.

     The Washington Post
  13. https://www.stuff.co.nz/science/7998309/Terminator-war-scenario-no-longer-a-joke