

Today, spy software lets you track a phone. In the past, when you wanted to know whether your husband or wife was cheating on you, or wanted the details of the cheating, you had to hire a private detective. The practice was so widespread that it became a staple of films.

Why can cell phone spyware be better than a private detective?

Today, with the advancement of mobile technology, there is another option: mobile phone spyware, and many people now use it. Comparing the two, cell phone spyware may be the better way to find out the truth about your husband or wife.


Let's start with the most obvious factor when tracking a person: cost. Hiring a private investigator only makes sense if you are willing to spend a lot of money. At a few thousand rubles an hour, you have to think carefully about how many hours of an investigator's time you want to buy. Having a private detective follow a person all day is common in films but unrealistic in real life. In practice, most people hire a detective for a few hours on key days, usually at night or on weekends. Even four to five hours of surveillance on a single weekend is already a significant expense, and if your spouse is innocent, or simply decided not to meet that weekend, the money is wasted.

Foreign versions of mobile phone spyware cost from $60 to $150 and work for three to four months, during which they can be activated at any time. In fact, the only real limit on surveillance time is your own need to sleep.

Of course, the main criterion should be effectiveness, not price. Fortunately, according to its developers, cell phone spyware does what it is supposed to do: it detects whether your husband or wife is having an affair, along with all the details of the deception. A phone spy even does something a private detective cannot: it shows you where your spouse is at any moment and lets you follow their SMS messages. You will always know where your wife or husband is and whether they are lying to you about where they are going.

Finally, there is one thing a private detective provides that cell phone spyware does not: a photo of your husband or wife. Given the difference in price, however, you can simply gather detailed information about where your spouse meets their lover and then hire any photographer to take the shot. In the end, it is the photo itself that matters, and you do not need a private detective to take it.

Huawei's 5G chipsets may become available for sale to third-party phone makers, and the company will say yes if US iPhone maker Apple wants to buy them. That, roughly, is what Richard Yu Chengdong, CEO of Huawei's Consumer Business Group (CBG), told the Global Times the other day. So far, Apple has not commented on Huawei's proposal.

High-tech news worldwide: the Apple iPhone 5G and Huawei chips for fifth-generation mobile phones.

Huawei is ready to sell its 5G technology to Apple if the price is right, according to a fresh report. California-based Apple is currently in the spotlight after Huawei and Samsung successfully launched their 5G smartphones this year. Even the long-forgotten Motorola has released its Moto Z3 multimedia phone, which can connect to Verizon's 5G network.


The reason Apple lags behind its competitors in this race is the lack of a 5G modem chip, the component used to transfer data across a wide variety of physical media.

Since parting ways with Qualcomm, the main chip supplier for the iPhone lineup, in 2016 over patent and licensing disputes, Apple has relied exclusively on Intel to implement its 5G plans.

What is 5G?

5G is the trendiest technology in the smartphone world. It is a next-generation networking technology designed to work alongside existing 4G networks. The new connectivity offers many benefits, including super-fast gigabit-per-second (Gbps) transfer rates that will let you download an entire TV series in minutes.
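As a rough illustration of what gigabit-class rates mean in practice, here is a quick back-of-the-envelope calculation; the 25 GB season size and the sample rates are assumptions chosen for the arithmetic, not figures from any carrier.

```python
# Rough download-time estimate at 5G-class rates.
# Assumption: one HD season of a TV series is about 25 gigabytes.

SEASON_SIZE_GB = 25  # assumed size, in gigabytes

for rate_gbit_s in (1.0, 2.0, 4.0):  # sample 5G downlink rates, in gigabits per second
    seconds = SEASON_SIZE_GB * 8 / rate_gbit_s  # 1 gigabyte = 8 gigabits
    print(f"{rate_gbit_s:.0f} Gbit/s -> {seconds / 60:.1f} minutes")
```

Even at a modest 1 Gbit/s, the whole season arrives in under four minutes, which is where the "entire series in minutes" claim comes from.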

5G is already live in some telecom markets, in regions such as Seoul, South Korea, and Chicago, USA. The new connectivity is due to reach parts of the UK, including London and Manchester, by the end of the year.

5G equipment manufacturers.

The news follows reports that Apple is "losing confidence" in Intel, which currently supplies the modems used in iPhones. Fast Company reported that Intel has been struggling to fulfill an order for demo 5G modems due this year. The hardware shortage could reportedly force Apple to postpone its planned iPhone 5G release.

According to Intel itself, its 5G modem chip will not be available in the first half of 2019, and it is not known when the product will reach buyers. Bloomberg previously reported that Apple may not have a finished iPhone 5G until 2020.

In January, Apple held talks with Samsung and MediaTek, along with existing vendor Intel, about supplying modem chips in the near future, but none of them is likely to have the chips ready before 2020.

The question of when the iPhone 5G will be released remains open.

Huawei once stated that its 5G chips were only for "internal products" and that it would not sell its 5G chipsets to third parties. But now, according to the CEO, "Huawei is open."

Engadget also reported that unnamed sources familiar with the matter confirm Huawei is ready to sell its 5G Balong 5000 chipsets to Apple.

If confirmed, this would mark a notable shift in strategy for Huawei, which has previously refused to sell its chips to rival phone and tablet makers.

Which phones support 5G? Samsung unveiled the Galaxy S10 5G at its Unpacked event in February. LG also has the new LG V50 5G coming out very soon. Most new phones are set to use Qualcomm's 5G-ready X50 modem.

Many Android phone manufacturers have already announced plans to release a 5G phone by the end of the year. Huawei unveiled its first 5G smartphone, the foldable Mate X, at MWC 2019 in February.

The era of 5G phones has already started!

TrunCAD 3DGenerator is fast, intuitive software for designing, costing and manufacturing furniture. With TrunCAD's 3DGenerator you will be able to highly automate your workflow steps and reap the benefits of a streamlined workflow.

Furniture industry news: 3D furniture software offers many functions, including design/modeling, calculation and production.

The latest version of 3DGenerator includes a new module called "Scribble". With this module you can sketch a wardrobe exactly the way you like it, simply by clicking on the screen with your mouse; dividers and cabinet fronts are positioned and moved with the mouse as well. When you finish sketching, the information is exported and displayed as 3D content. After importing these "scribbles" into 3DGenerator, all functions such as parts lists and CNC export can be used directly. The Scribble module speeds up presenting designed furniture to your client.


Furniture planning and design using the program.

Involve your client directly in the planning process. This reduces frustration with the furniture project, since the client can take part in the design. TrunCAD 3DGenerator helps you design individual pieces of furniture as well as plan entire rooms. It takes only a few parameters to produce a realistic 3D model of the furniture or project for your client. If your client needs time to decide, simply hand over a presentation of the furniture, and the 3D Viewer will display the work on a personal computer or laptop.

When your client is happy with the design, 3DGenerator calculates a quote according to your individual pricing. All information is available immediately after the design is complete. Thanks to the high compatibility of TrunCAD 3DGenerator, data can be exported to many software packages, so you can keep working in a familiar software environment with efficient transfer of furniture data guaranteed.

Complex furniture projects.

Presenting furniture is just as easy: a single click creates parts lists, cut lists and even CAM programs, along with 2D and 3D geometry. You can define different inserts that can be exported as 2D DXF drawings to almost any CAD system, and a design from 3DGenerator can be exported to almost any 3D CAD system and modified there. You can therefore use TrunCAD 3DGenerator to start complex projects and export the data to CAM systems.

Furniture creation.

Now with TrunCAD 3DGenerator, furniture production can start immediately.

Cross-platform frameworks give mobile developers a complete set of tools designed to improve productivity by solving common problems. The question is which frameworks are best for you. To help you answer it, we have prepared a list of cross-platform frameworks for developing high-quality mobile applications.

Developing a mobile application with a cross-platform framework is a shorter path to getting the job done.

With nearly three million apps on Google Play, the Android operating system dominates the mobile environment. Individuals, small businesses and large enterprises are all working hard to establish a strong mobile presence and grab their share of the market. However, not everyone has the experience and resources needed to build a good mobile app from scratch using native tools.


The goal of frameworks is to make mobile app development as easy as possible.

List of cross-platform application development frameworks:

- Corona SDK;

Is it easy to build apps and games with the Corona SDK? Its creators promise ten times faster game and mobile app development. How is that even possible? Largely because a Corona application is built entirely on Lua, a lightweight multi-paradigm programming language with an emphasis on speed, portability, extensibility and ease of use.

The official Corona SDK website contains tutorials, guides, and examples designed to turn novice mobile developers into experienced professionals, covering everything from the basics of mobile development to more advanced topics. The Corona SDK framework is completely free, and it is genuinely cross-platform: it runs on both Windows and Mac OS X and supports real-time application testing.

- TheAppBuilder;

TheAppBuilder, a framework used by some of the largest organizations in the world, provides a user interface designed to speed up app development. It reportedly works best for building company presentations and other information apps. The framework comes with ready-made blocks for push notifications, feedback, polls, content updates, analytics and more. Best of all, TheAppBuilder integrates directly with Google Play, letting you publish finished applications with a single click.

- Xamarin;

The Xamarin framework was developed by the same people who made Mono, an ECMA-compliant, .NET Framework-compatible set of tools. Xamarin offers developers a single C# codebase they can use to build native apps for all major mobile operating systems.

Unlike many other frameworks, Xamarin has already been used by over 1.4 million developers around the world. With Xamarin for Visual Studio, developers can harness the power of Microsoft Visual Studio and all of its advanced features, including code completion, IntelliSense, and debugging apps on a simulator or mobile device. The Xamarin Test Cloud feature lets you instantly test applications on 2,000 real devices in the cloud, remotely over the Internet. Today this is one of the best ways to cope with the severe fragmentation of the Android ecosystem and release error-free mobile apps that work without major issues on most devices.

- Appcelerator Titanium;

The Appcelerator Titanium framework is part of the Appcelerator Platform, which includes all the tools a mobile app developer might need to build, test, and deploy highly optimized apps. Titanium uses JavaScript to call an extensive collection of APIs, which in turn invoke the operating systems' native functions for exceptional performance and a natural look and feel.

Titanium includes a visually oriented mobile development workflow that relies heavily on pre-built blocks of code assembled via drag and drop. You can create data models programmatically or visually, and test and track finished mobile apps in the cloud with the Mobile Lifecycle dashboard, which provides valuable insight into app performance.

- PhoneGap;

PhoneGap by Adobe is one of the world's most popular app development frameworks. It is built on Apache Cordova, an open source mobile development environment that uses HTML5 and CSS3, along with JavaScript, for cross-platform development. PhoneGap itself is also completely open source.

It is based on an intuitive desktop application used to create apps and connect them to mobile devices (phones, smartphones, tablets). Gone are the obscure text commands that are easy to mistype and hard to remember. The desktop app is complemented by the PhoneGap mobile app, which lets you instantly see changes on a connected mobile device. Other things that make PhoneGap so highly recommended are its large plugin library, third-party tools, and thriving community.

- Ionic;

Ionic is a free and open source framework licensed under the MIT license. It offers a whole library of components and tools, and lets you develop progressive web apps and native mobile apps for every major app store, all from a single codebase. First-class native plugins make it extremely easy to use features such as Bluetooth and HealthKit, and fingerprint authentication is also supported.

Ionic is also designed for performance tuning and optimization: apps built with it look consistent and work equally well. So far, nearly four million applications have been created by five million Ionic developers around the world. If you would like to join them, visit the official website and learn more about the framework.

- NativeScript;

JavaScript, Angular and TypeScript are arguably among the most commonly used web development technologies, and with the NativeScript framework you can also use them to build mobile apps. Simply put, NativeScript creates platform-native user interfaces from a single codebase. Unlike many other frameworks, NativeScript is backed by Telerik, a Bulgarian company that offers a range of software tools.

Looking for tutorials on building mobile apps with the cross-platform NativeScript framework? To help developers get familiar with it, the official website offers many examples and detailed teaching materials. You can browse real-world implementations of mobile applications, study the official documentation, and even dive into the source code.

- React Native;

React Native is developed by Facebook and used by Instagram, Tesla, Airbnb, Baidu, Walmart and many other Fortune 500 companies. Facebook's React JavaScript framework is open source. Because React Native uses the same UI building blocks as regular iOS and Android apps, it is impossible to distinguish a React Native app from one built in Objective-C or Java. As soon as you update the source code, you immediately see the changes in the app preview. And if you ever need to hand-optimize certain parts of your application, React Native lets you mix in native components written in Swift, Objective-C or Java.

- Sencha Touch.

What is Sencha Touch? Like TheAppBuilder, it is an enterprise framework for building universal mobile applications. It uses hardware acceleration techniques to achieve high performance, and comes with five dozen built-in UI components and decent-looking themes, making it easy to create stunning, engaging apps.

The framework includes a robust data package that can consume data from any backend data source. With this package you can build data collections using highly functional models offering features such as sorting and filtering. Sencha Touch has received praise from many influential companies and organizations.

Conclusion of the review of cross-platform frameworks for mobile application development:

Whichever mobile development framework you choose, don't be afraid to change your mind if you ever feel there are better options. Cross-platform frameworks evolve rapidly, and new ones appear regularly. Their goal is to help you quickly turn a rough idea into a working app, and a working app into a finished product. In the end, it doesn't matter whether you reach that goal with the latest framework everyone is talking about or with a long-established one that is starting to gather dust.

When Chinese IT company Huawei decided to unveil its new multimedia phone inside the ornate National Museum of Catalonia in Barcelona, it was undoubtedly hinting to the assembled technology journalists at what they were about to see. After all, the newly unveiled foldable Huawei Mate X looks a bit like a rare Picasso painting.

First look at the Huawei Mate X: a folding-screen smartphone that is attractive, packs powerful specifications, and carries an incredibly high price.

So what is the Huawei Mate X? The first impression can be summed up in a single phrase: this smartphone is great. Even calling it beautiful undersells it; it is magnificent in its own way. It has arguably the most distinguished industrial design of any mobile phone the tech giants have produced in recent years. The new Huawei clearly pushes the boundaries of what a smartphone can be: a phone-sized screen unfolds into a tablet-sized one, so mobile content can be viewed in whatever way best suits the situation.


Those who know phones might add that, when it comes to its one-of-a-kind price, the Mate X is also a bit like a Picasso: it is a very expensive smartphone. The Mate X has raised the bar on smartphone prices, but given the specs on offer, it may justify its price tag for those deciding which phone to buy.

Display on Huawei Mate X.

Which display mode is best? The Huawei Mate X has a single display that can be used in three different configurations. The first mode is an 8-inch tablet: a nearly square panel with an 8:7.1 aspect ratio and a resolution of 2480 by 2200 pixels.

Since the screen sits on the outside of the smartphone, folding it gives you two screens. The front screen offers 6.6 inches edge to edge, with a 19.5:9 aspect ratio and a resolution of 2480 by 1148 pixels.

There is also a rear screen, which offers fewer inches because it shares space with the device's cameras and a grip bar. You will mostly use this side for taking selfies. It delivers a decent (but narrow) 6.38-inch panel with a stretched 25:9 aspect ratio and a resolution of 2480 by 892 pixels.
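Out of curiosity, the quoted aspect ratios can be checked against the quoted resolutions. The short sketch below does exactly that; the figures are the ones from the spec sheet above, and small discrepancies simply reflect rounding in the marketing ratios.

```python
# Check the quoted Mate X aspect ratios against the quoted resolutions.
screens = {
    "tablet (unfolded)": (2480, 2200, 8 / 7.1),
    "front (folded)":    (2480, 1148, 19.5 / 9),
    "rear (folded)":     (2480, 892,  25 / 9),
}

for name, (width, height, quoted_ratio) in screens.items():
    actual_ratio = width / height
    # relative difference between the true ratio and the marketing ratio
    error_pct = abs(actual_ratio - quoted_ratio) / quoted_ratio * 100
    print(f"{name}: {actual_ratio:.3f} vs quoted {quoted_ratio:.3f} ({error_pct:.1f}% off)")
```

All three come out within a fraction of a percent, so the spec sheet is at least internally consistent.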

How convenient is the Huawei Mate X in terms of thickness?

When the Huawei Mate X is folded, it is 11 millimeters thick, and unlike its rival, the Samsung Galaxy Fold, it has no bulky gap: it folds completely flat and locks in place with a click. It would be interesting to test how well the lock holds when the phone is tossed into a bag, and whether it can pop open by accident.

When unfolded, the Mate X is 5.4 mm thick, slightly thinner than the iPad Pro!

On the Huawei Mate X: the cameras, the grip bar, everything for the user!

A quick glance at the side of the Huawei Mate X reveals the grip bar (Huawei's rather descriptive term for it). The device houses three cameras, including Leica hardware. For anyone following tech news this was no surprise: the same configuration has appeared on every Huawei flagship since the P20 Pro, and it would be strange for Huawei to drop such a feature from such a revolutionary device.

You may notice that the phone has no dedicated front-facing selfie camera. That is because the three main cameras are the selfie cameras: to take a photo of yourself, you just fold the phone and flip it over.

This is all quite exciting. Huawei's premium phones regularly rank among the best camera phones on the market. While the company did not share camera samples at the launch event, it is safe to assume many people will love taking selfies with a high-end mobile camera augmented by Master AI software.

And since the back of the Mate X also carries a screen, you can use it while taking photos, for example to show your subject a preview of how they will look in the shot.

Huawei officials say the Mate X has no camera issues. This is good news, both in terms of looks and overall durability. The latter is what the company focused on with the announcement of a specially designed protective phone case.

New 5G connectivity and performance on the Mate X.

When reviewing the Mate X, it is important to remember that Huawei is not just a phone maker; it operates in many IT areas, including SoC design. So it is no surprise that the Mate X uses Huawei's own Balong 5G modem along with the Huawei Kirin 980 processor.

The modem is particularly interesting, as Huawei promises more than double the performance of competing parts such as the Qualcomm Snapdragon and Samsung Exynos modems. Users who can afford a Mate X should supposedly enjoy download speeds of 4.6 Gbit/s, enough, for example, to download a 1-gigabyte movie in just three seconds. Of course, we could not verify this independently, so for now we have to take Huawei's word for it.
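The claim is at least easy to sanity-check on paper: a gigabyte is eight gigabits, so at the quoted peak rate the theoretical minimum is under two seconds, and the quoted three seconds leaves headroom for protocol overhead and real-world signal conditions. A back-of-the-envelope sketch, not a measurement:

```python
# Back-of-the-envelope check of the "1 GB movie in three seconds" claim.
PEAK_RATE_GBIT_S = 4.6  # quoted Balong 5000 peak downlink, in gigabits per second
MOVIE_SIZE_GB = 1       # movie size, in gigabytes

ideal_seconds = MOVIE_SIZE_GB * 8 / PEAK_RATE_GBIT_S  # 1 gigabyte = 8 gigabits
print(f"Theoretical minimum at peak rate: {ideal_seconds:.2f} s")
```

This prints a theoretical minimum of about 1.74 seconds, so Huawei's three-second figure is plausible as a real-world number rather than a lab-perfect one.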

What operating system is installed in Huawei Mate X?

On the software side, the Mate X runs Google's Android 9.0 Pie.

A Huawei spokesperson also said that Desktop Mode software will be available for the new foldable, allowing the Mate X to serve as a smartphone, a tablet, and even a desktop computer.

Memory Huawei Mate X.

The Mate X is a dual-SIM phone, with one slot supporting 5G and the other limited to 4G. If you don't need the second SIM, you can insert an NM card instead (NM, or nano memory, is a card format invented by Huawei that offers the same kind of storage as a microSD card in a smaller form factor) to add extra storage. The base version of the smartphone already comes with 512 GB of memory; even the most enthusiastic filmmakers are unlikely to fill all of it.

Rechargeable battery for Mate X.

With such a big screen to drive, you will be glad to know the Huawei Mate X ships with a rather gigantic battery. The device has two cells with a combined, respectable capacity of 4500 mAh. Unfortunately, no battery tests are available yet, so it is hard to say how this translates into real-world use.

The Chinese company says the Mate X comes with a 55W SuperCharge feature that can recharge the phone's battery to 85 percent in just thirty minutes.
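That charging claim can also be given a rough plausibility check. The sketch below assumes a typical lithium-ion nominal cell voltage of 3.85 V (an assumption on our part; Huawei quotes only milliamp-hours) and ignores charging losses:

```python
# Rough plausibility check of "85% of a 4500 mAh battery in 30 minutes at 55 W".
CAPACITY_AH = 4.5       # 4500 mAh combined capacity, per the spec sheet
NOMINAL_VOLTAGE = 3.85  # assumed typical Li-ion nominal voltage, in volts
CHARGE_FRACTION = 0.85  # charging to 85 percent
PEAK_POWER_W = 55       # quoted SuperCharge peak power, in watts

energy_wh = CAPACITY_AH * NOMINAL_VOLTAGE * CHARGE_FRACTION
minutes_at_peak = energy_wh / PEAK_POWER_W * 60  # if 55 W were sustained
implied_avg_power_w = energy_wh / 0.5            # over the quoted half hour

print(f"Energy to deliver: {energy_wh:.1f} Wh")
print(f"Time at a constant 55 W: {minutes_at_peak:.0f} min")
print(f"Implied average power over 30 min: {implied_avg_power_w:.0f} W")
```

Sustaining 55 W would take only about 16 minutes, so the quoted 30 minutes is consistent with the charger tapering its power as the battery fills, which is exactly how lithium-ion charging works.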

Pricing Huawei Mate X.

The Huawei Mate X is perhaps the most important phone the up-and-coming Chinese tech brand has ever presented, and not just because it cements Huawei's reputation as an innovative premium phone maker. The phone embodies more than three years of the company's research and development and integrates advances in materials technology and communications equipment.

With that in mind, don't be surprised that the smartphone carries a very high price, starting at 2,299 euros. When Huawei CEO Richard Yu broke the news, the rapt silence of the crowd gave way to a single whispered question: how much does it cost?

Speaking of prices, that is about 300 euros more than the flagship Samsung Galaxy Fold, and about 800 euros more than the most expensive Apple iPhone. At this price the Mate X sits in the same range as Huawei's previous luxury phones, which carried the branding of the luxury car marque Porsche.

Huawei is well aware of the Mate X's high cost, and during the presentation Richard Yu said the price reflects the cost of mobile research and development. He explained that the patented hinge separating the two displays took three years to develop and contains over a hundred different parts. That kind of R&D is not cheap, and the costs were inevitable.

However, two things are inevitable. First, there will be no shortage of pioneering enthusiasts willing to save up for a premium phone. For these shoppers there is an undeniable charm in being among the first to own something special. Perhaps Huawei can ride the news buzz and gain more from it than it would from selling cheaper phones.

Second, market prices will inevitably fall, perhaps not for this particular smartphone, but certainly for foldable smartphones in general. In time, 2,300 euros for a phone will look like an aberration. The drop will be driven by several factors, from inevitable economies of scale to competition from other up-and-coming brands such as Xiaomi and OPPO, which are steadily entering the Western smartphone market.

Availability of purchase of Huawei Mate X.

Huawei did not disclose how much the device will cost in the UK, but a reasonable guess would be around £2,300, taking into account previous pricing trends, high UK sales taxes and the continued decline of the pound.

Huawei CEO Yu also made no mention of plans to release the Mate X in the United States, which is not surprising: the company rarely sells phones there. The Mate 20 Pro, which until recently was the best Android phone available at a reasonable price, was entirely absent from the American market, forcing US consumers to order it from abroad. That situation could push prices even higher for US users, who may also face steep customs duties and taxes.

When will the Huawei Mate X be available?

Huawei has announced that the Mate X will go on sale in the middle of the year; unfortunately, the announcement was no more specific. We will simply have to wait and see what the official release date turns out to be.

Planning to buy a new premium phone? There are reasons why it is better to wait before buying one right now. What kind? Here are the main ones. From premium phones in 2019 a buyer can expect: the new Qualcomm Snapdragon 855 mobile chip, super-fast 5G connectivity, foldable screen designs, and 48MP mobile cameras.

All about phones and buying them: if you are planning to buy a new premium phone, wait at least a month. Here is why:

Mobile World Congress 2019 (MWC 2019), taking place in just a couple of weeks in late February, is expected to see most leading smartphone manufacturers present their latest flagship phones with advanced features and updated specifications.


So, here are the new cell phone features for this year.

Samsung will release the multimedia Galaxy S10, while HMD Global will present the five-camera Nokia 9 PureView. Phone makers Huawei, Oppo and LG will also showcase their latest mobile devices at the show.

But in 2019 buyers should be thinking about more than just another model refresh cycle, and the reasons lie in the phones' unique technical specifications.

- Qualcomm Snapdragon 855 processor.

A top-end Qualcomm processor powers most premium phones, from the Samsung Galaxy S9 to the OnePlus 6T. The Snapdragon 845 is now history: the latest Qualcomm Snapdragon 855 chipset, built on a 7nm process, offers better performance, better battery efficiency and built-in artificial intelligence (AI) processing.

Combined with the Snapdragon X50 modem, the Snapdragon 855 will also bring 5G connectivity to premium smartphones in 2019.

Other major features of the chipset include improved gaming performance (the Adreno 640 graphics processor), artificial intelligence, support for higher-resolution cameras, and an in-display fingerprint sensor.

- 48 megapixel camera.

The latest premium smartphones are expected to come with higher-resolution cameras. The 48MP camera is the new rage, and several phones, such as the Honor View20 and Redmi Note 7, already offer one.

While resolution is definitely not the best measure of a camera, the underlying sensors are also vastly improved. Most of these 48MP camera phones are likely to use the Sony IMX586, billed as the highest-resolution camera sensor for mobile phones.

Besides better camera resolution and sensors, 2019's premium phones may also adopt Samsung-style quad and even penta (five) camera setups. Most dual-camera phones in 2018 paired a main camera with a secondary one that ranged from ultra-wide and depth-sensing to monochrome.

The new phones are expected to combine most of these sensors in three-, four- or five-camera arrays.

- Fifth generation mobile communications: 5G.

The evolution of mobile networks continues! The upcoming MWC 2019 will also be the launch pad for 5G phones: Xiaomi, OnePlus, Samsung and almost every leading player in the mobile market is expected to present one. Most of these phones will reach the European and US markets later this year. Some Apple fans already want to buy a 5G iPhone. In other countries the rollout of 5G networks may be delayed by at least a year, but even so, investing in a 5G phone right now is not a bad idea.

- Foldable mobile phone.

Foldable phones are no longer just a concept; folding screens are already part of mobile phone spec sheets. Korean giant Samsung unveiled its first foldable phone late last year and is expected to reveal a commercial version at its event on February 20, ahead of the MWC 2019 show.

Samsung is clearly betting heavily on the new form factor, as it plans to produce at least one million foldable phones this year. Given that Russia is among its priority markets, you can expect foldable phones to be released there as well. Besides Samsung, Huawei, Xiaomi and Oppo all plan to release foldable phones this year.

- Artificial intelligence in phones, plus don't forget about machine learning.

Last year Google introduced the Android 9 Pie operating system. Android Pie features such as Adaptive Battery and adaptive brightness rely on machine learning to improve the user experience on Android phones. Going forward, artificial intelligence and machine learning will become an ever more important part of updates to the Google Android platform. It may be worth making sure your new phone will get not only Android 9 Pie but also its successor, Android Q.

Beyond Google, phone companies like Xiaomi and Asus are embedding artificial intelligence (AI) and machine learning (ML) right into system apps. The camera on premium phones, for example, uses AI and ML to automatically recognize scenes and automatically optimize settings. Most mobile phones in 2019 will be equipped with cameras with advanced artificial intelligence functions.

The only thing that remains a dream when buying is when the best mobile phones will have a full-fledged 3D phone feature.

News added:

1) Samsung has released the latest version of the Galaxy S10, and people believe the iPhone may give up its position as the king of smartphones.

The latest flagship smartphone Samsung Galaxy S10 was launched by the company on February 20. Samsung introduced a lot of new products on this day. The audience was really interested in the demonstrated new phone. So much so that they say the Apple iPhone has a serious alternative. With the latest Galaxy S10, Samsung has surprised and shocked fans, in a good way.

2) Attractive, powerful and incredibly expensive 5G foldable phone Huawei Mate X.

Following the announcement of the first foldable smartphone Samsung Galaxy Fold, the Chinese company Huawei is betting on the foldable screen form factor and announces the release of the Huawei Mate X, which still works with 5G connectivity. The developer Huawei takes a completely different approach compared to Samsung, namely, placing the folding smartphone display on the outside, and not on the inside, and this solution has a number of pros and cons when describing the new generation phones. The Huawei Mate X starts at 2299 Euros.

3) Will the Apple iPhone be foldable?

Some analysts believe a foldable iPhone may be in development at the Cupertino-based company. Then, if Apple's new smartphone comes with a foldable screen, it has a chance to become the best among the already released foldable smartphones Samsung Galaxy Fold and Huawei Mate X.

Moom from the developers of Many Tricks has been bringing order to the chaos since 2011, making managing windows in the operating system as easy as clicking a mouse or using a keyboard shortcut. With Moom, you can easily move and scale windows to half the screen, quarter of the screen, or fill the screen; set custom sizes and locations, and save layouts open windows for one-click positioning. Once you've tried Moom, you'll be amazed at how you've used your Mac before without it.

Software Review: Moom is a program for moving and resizing windows in the Mac OS system.

So, Moom allows you to move and scale windows - using your mouse or keyboard - in predefined locations and sizes, or in full screen mode. When using the program with your mouse, all you have to do is hover over the green resize button and the Moom interface will appear. When you use the keyboard, click on the shortcut you defined and the Moom keyboard frame will appear, then you can move the windows with the arrow keys and modifier keys.


Moom can be run as a traditional app, as a menu bar app, or as a completely faceless background app.

The location of the pop-ups.

Hover your mouse over the green button of any window and the Moom Palette pop-up appears.

Quickly fill the screen or move and resize vertically or horizontally at the edges of the screen. Want quarter-size windows instead? Option-holding down the palette presents the four quarter-size corner options, along with "center unresized".

Resizing is not a problem.

It's actually drag and drop using Moom's unique on-screen resizing grid.

Click in an empty box below the pop-up palette, move your mouse to where you want to position the window, then click and drag its new dimensions.

Let go of the mouse button and the window will fill in the path you drew on the screen, it's not difficult at all.

Want to quickly move and scale windows in specific areas of the screen? Just turn on Moom's snap edges and corners.

Take a window, drag it to an edge or corner, and release the mouse button. You can set the resize action for each location in the Moom settings.

Set the window set to the size and location you want, then save the layout. Restore the layout using the assigned hotkey or through the Moom menu.

This feature is especially useful if you are using a laptop with an external display, Moom can launch saved layouts when displays are added or removed.

No mouse required.

Don't worry, keyboard users. Moom isn't just for those who prefer to use a mouse. Turn on keyboard controls and you can move, resize, center, use the on-screen grid, and more - all without touching your mouse.

Also, every custom Moom command, keep reading, can be assigned a global keyboard shortcut, or one that only works when the keyboard controller is on screen.

Countless custom commands.

Create and save frequently used Moom actions in a custom command menu, with additional delimiters and labels.

Moving, scaling, resizing, centering, even moving to other displays can all be done with custom commands. You can even create a series of commands tied to a single shortcut, simplifying complex move and resize operations.

But wait, that's not all about moving and resizing windows on Mac OS using Moom.

Use Moom as a regular Dock-based app, as an icon in the menu bar, or as a completely invisible background app.

Custom commands are accessed through the Moom menu bar icon, the green button pop-up palette, or keyboard shortcuts.

Use a small hexagonal grid to resize the grid instead of the full screen virtual grid.

Move windows across displays, and use related commands to scale them to new sizes and locations as you move.

You can display a keyboard cheat sheet that shows you what tasks you've assigned to which keys in keyboard mode.

Resizing windows to exact sizes, ideal for testing how well windows fit into windows of different sizes.

The Moom developers have made an effort to achieve these goals, where great software must do its job efficiently, have a clean interface, and be pleasant to use.

Summary:

Moom is a Mac OS application developed by Many Tricks that allows you to quickly arrange, resize, move, scale, and shape windows so that you spend as little time as possible placing windows and more time working with them.

System requirements for Moom:

The program requires macOS 10.8 "Mountain Lion" or later to be installed on your computer. You can try Moom for free.

Trying to download and choose the best file manager for Windows? There is good news, this is a portable program XYplorer, it is just a file manager for Windows and has such features as tabbed browsing, powerful file search (like explorer, alternative), universal preview, customizable interface, optional dual pane and large set unique ways to efficiently automate frequently repetitive tasks. This file manager for Windows computers, according to the developer Cologne Code Company, is fast, innovative, lightweight and portable. Read on for an overview of the XYplorer program!

What is a file manager for Windows today.

Learn more about the functionality of the XYplorer file manager. So, there is an export of extended information about files of entire directories (or even directory trees) into files in CSV text format. Automatic column width adjustment. Customizable display formats for file size and date information. The (real) disk space used is immediately displayed for each file and folder. Remembers the last folder and sort order. Browser-like history functionality. Favorite folders can be assigned. A large set of useful commands added to the standard file context menu, including Copy To, Move To, Copy Filename With Path, Copy File Properties, Rename Multiple Files. Extract icons, multi-file timestamp and attributed. Instant display complete information file / version for each selected file. Instant preview of images, audio and video files (display detailed media information). Instantly view file contents for all files (ASCII and binary), including extracting text from binaries (fast enough). Full support for Drag and Drop and mouse wheel.


XYplorer what is it for the user

XYplorer as a two-pane file manager for Windows was built for heavy work. The program is easy to install and easy to uninstall. Installing and running the program does not change your system or registry. Ease of use in that you can get started in no time (the interface is fully compliant with file manager standards). The program is small, fast and convenient for the computer's RAM.

Portability:

XYplorer is a portable file manager. That is, it does not require any installation in the computer operating system, stores all configuration data in the program data folder, and its launch does not change your system or registry. Take it with you and you can run the program from a USB flash drive. Then file management is in your hands.

Working with tabs:

The tabs in the file manager make it easy to switch between folders. Drag and drop them, hide them, lock them, name them or put files on them. Tabs remember their configuration individually and by session. In addition, the user gets tabs and a double pane.

Functionality:

XYplorer was designed to make the user experience faster, according to the developer. Indeed, numerous usability improvements to an attractive interface help streamline your workflow and make it more efficient. Under these conditions, you can save a lot of time when working with files in Windows.

File manager scripts for many tasks:

Yes, you can program this program. Individual solutions for individual tasks. No plugins required, scripts run from the program folder. Even beginners can benefit from this feature as many ready-to-use scripts are available on the official file manager forum.

The speed of the program:

Speed ​​has always been the main goal of XYplorer software development. The code is constantly optimized for performance, zero tolerance for slowness. In addition, the file manager uses very little RAM in Windows, the executable file is small (only 7 MB) and is loaded on the system almost instantly.

Reliability:

Can I trust the XYplorer file manager. One thing is clear that the program works as intended by the developer and is expected to work, it seems that it is very difficult to put it into a state of failure. In addition, the developer states that any problems with the program are immediately resolved and usually resolved within a few hours. It is worth adding that a large community is closely monitoring the development of the file manager and is constantly testing the frequently released beta versions.

Customizable software:

You can customize the file manager to look and behave the way you want it to. Customization ranges from fonts and colors to customizable toolbar buttons and even file icons and program associations. And every part of the XYplorer file manager is completely portable. Even dark mode.

Responsiveness of the XYplorer program developer:

System requirements for the program:

Since XYplorer is a portable file manager. File management does not require installing or modifying your operating system or registry. You can take the program with you and just launch the file manager from the USB stick along with your personal configuration.

XYplorer software works under 32-bit and 64-bit versions of Microsoft operating systems:

Windows Server 2003;
- Windows XP;
- Windows Vista;
- Windows Server 2008;
- Windows 7;
- Windows Server 2012;
- Windows 8;
- Windows 8.1;
- Windows Server 2016;
- Windows 10.

You can try the file manager for free, but remember that the demo version of XYplorer is fully functional only for 30 days after installation on your computer!

Fast Internet Video Downloader for Mac: Downie will save video content one time or according to a list and a customizable "alarm".

Internet Video Downloader - Downie is currently supported by over 1,000 different sites (including Facebook, Vimeo, Legendary YouTube, Lynda, Youku, Daily Haha, MTV, iView, South Park Studios, Bloomberg, Kickstarter, NBC News, CollegeHumor , MetaCafe, as well as Bilibili and other video sites). Plus, the list of sites from which the program can download videos is growing rapidly.


Downie features:

Support for downloading 4K YouTube videos - Unlike many other YouTube video downloader programs, Downie supports HD YouTube videos up to 4K.

Frequent updates - no need to wait long for new sites to be added from where you can download videos or fix bugs. Downie is updated approximately once a week with new features, supported sites, and more.

International approach - Downie downloader supports not only specific sites created for a specific country, the program is also localized into different languages. If your language is not in the list of supported languages, just contact the developer Charlie Monroe Software and discuss this issue.

New Features in Downie:

Redesign of the user interface of the program - user interface the bootloader has been redesigned from scratch. According to the developer's statement, the interface has become faster, more convenient and visually pleasing.

Menu bar icon - you can manage downloads from the menu bar, without having to be distracted from the current work.

Improved HLS support - As stated by the program developer, HLS streams load four times faster.

DASH support - DASH streams are now supported.

Major Post-Processing Improvements - Some uploads may only take a few seconds to post-process instead of minutes thanks to Downie, a shortcut to analyzing video before converting it.

Simple Mode - If you prefer to keep the user interface as simple as possible, there is a Simple Mode for you.

Grouping video files by the site from which they downloaded and the playlist - all downloads can now be sorted by folders depending on where you downloaded them from or from which playlist they are.

Delayed queue start is a function of scheduling downloads for the required time (for example, you can schedule a video download for the middle of the night) so as not to overload the Internet channel for the whole family.

Support for user-controlled pop-ups - the program now additionally supports pop-ups, so you can enter sites that open the login in a separate window.

Simple tips for using Downie:

If you have a large list of links or a lot of links within some text, just drag and drop it all onto Downie - the downloader will scan the text for links with video content.

You can also use copy and paste - just press Command-O in Downie and you can paste a lot of links.

Fast user support:

The developer of the video downloader responds to emails usually within 24 hours and quite often adds support for the requested sites to the program in the next update.

A few words from the developer of the program:

Charlie Monroe, CEO, Developer & User Support:

"My goal is to deliver the best apps and provide the best possible support."

Downie Compatibility:

Anyone who has thought about what to download Downie program for Mac. You should be aware that to work with the program you need a computer with an operating macOS system 10.11 or newer.

Breaking news of the software: VideoSolo DVD Creator for converting and recording video, with wide functionality for the user.

So, with the help of VideoSolo DVD Creator, burn almost any video to DVD and even Blu-ray discs easily and quickly, with excellent flexibility of settings (you can burn video, edit video, add audio, edit DVD menu).


It is possible to download online videos to burn DVD or Blu-ray discs.

You need to solve the problem of how to download videos from sites online? For example, from sites like YouTube, Facebook, MTV, Vimeo, Yahoo, Dailymotion, TED, Vevo, Niconico, AOL, Worldstar Hip Hop, Youku, CBS, ESPN and others. With this program, home movies or videos, after downloading from an online site, can still be burned to DVD or Blu-ray.

The program allows, in several simple steps, download 3D videos, high definition videos (4K, 1080p and 720p resolutions) and music for any player.

Styling your DVD with a suitable menu.

The flexible VideoSolo DVD Creator offers a variety of incredible templates to edit your DVD menu for you. Already available design themes such as holiday, family, wedding and more. After choosing the menu template you like, you can edit the text of the DVD menu and define its font, size, color. DVD menu creation is quite convenient.

What's more, you can separately set background music, background picture and opening movie with your music, picture and video file.

Setting up DVD subtitles and audio tracks.

Need to modify or create subtitles or audio tracks on your DVD? DVD Creator allows user to customize subtitle and audio track. That is, you can add subtitles and audio tracks to your DVD manually. Supported subtitle file formats SSA, SRT, and ASS.

For audio files, this program supports almost all popular audio formats, so it is easy to import them into the program. With DVD Creator, you can edit audio volume and adjust subtitle position to get a personalized DVD file.

Video editing and live preview.

This DVD burning tool is designed with powerful video editing function that allows professionals and beginners to create professional looking DVDs. Which allows you to adjust video effects such as brightness, saturation, hue, volume and contrast.

VideoSolo DVD Creator also supports the ability to crop video length, cut video, change aspect ratio, set position and transparency, and add watermark from text or image to video.

The user of the DVD Creator software can watch the DVD video at a convenient moment before burning to make sure everything is created as it should.

Video review of VideoSolo DVD Creator: User's Guide.

Created ARPANET.

And the ARPANET was formless and empty.

And the spirit of ARPA hovered over the net.

And said ARPA, "Let there be protocol,"

And the protocol became.

And I saw ARPA that it was good.

Danny Cohen

As they say, in every joke there is only a grain of a joke ... In my opinion, such a free use of the Bible text by the American Denis Cohen indicates not so much his lack of reverence for the Bible, as his desire to raise the fact of the birth of the Internet to the level of divine manifestation. Compare the creation of the world with the birth of another world - the world of the Internet, in which we spend more and more of our time ...

Leaving the topic of escapism - leaving the world of the real to the world of the Internet - to psychologists and philosophers, let us recall the stages in the development of technology that led to such a global phenomenon as the modern Internet. A historical excursion will help us to better understand the structure of the Web, the technological principles of its organization and to trace which scientific teams and organizations we owe primarily to the formation of such an important phenomenon of modern computer culture as the Internet.

When I turned to the study of various literature on the history of the Internet, I was surprised to find that many authors give a variety of dates for the birth of the Web. Some believe that the beginning of the Internet was laid back in 1962, others trace its history since 1969, others call the date of birth 1983, the fourth - 1986, and at the same time, each quite convincingly justifies his point of view. One cannot but agree that each of these dates is marked by important events in the development of the Internet. I got the impression that by tracing all these dates of birth described in the literature, it is just possible to get an idea not only of history, but also of the essence of such a phenomenon as the Internet. I hope that after reading the further story, the reader will agree with me.

Sixties - the birth of ARPA and ARPANET

So, the very first date from which the history of the Internet begins is 1962. On the one hand, this statement seems very bold: after all, in 1962, no one knew what the Internet was, and until the moment when this word appeared, it was still quite a long way away.

In those distant times, there were no more than 10,000 primitive computers in the world, which were not as easy to work on as they are now: computers were much less "friendly" and at the same time cost more than one hundred thousand dollars. The monopoly on telephone communications belonged to AT&T.

However, it was in that distant 1962 that the Advanced Research Projects Agency of the U.S. Department of Defense (ARPA) opened a project that was later called ARPANET and much later the Internet.

In 1962, important research began at a number of US educational institutions, most notably at the Massachusetts Institute of Technology (MIT). It was in 1962 that a young American scientist from MIT J.S. Lickleader wrote a paper where he expressed the idea of ​​a global network that would provide every inhabitant of the earth with access to data and programs from anywhere in the world. In October of the same year, Lickleader became the first head of the ARPA IPTO (ARPA Information Processing Techniques Office) department. At the same time (also at MIT), another scientist, Leonard Kleinrock, completed his Ph.D. thesis in communication network theory and received an assistant position at the University of California, UCLA. In the same year, a young up-and-coming MIT employee (also a future ARPANET contributor) Ivan Sutherland, using the TX-2 machine, created the pioneering interactive graphics program Sketchpad (Notepad), which had a great influence on the development computer graphics... Soon, these scientists were destined to meet while working on a research project at ARPA. In 1963, Lickleader invites Ivan Sutherland to work on the ARPA project, and two years later, another scientist who later made a great contribution to the creation of the Internet, Bob Taylor, joined the group. Lickleader signed contracts with MIT, UCLA, and BBN (a small consulting firm, Bolt Beranek & Newman) to begin realizing his then-daring ideas. In 1963, an important event occurs: the first universal ASCII standard appears - a coding scheme that assigns numerical values-codes to letters, numbers, punctuation marks and some other characters, as a result of which it becomes possible to exchange information between computers from different manufacturers.

In 1964, at almost the same time, MIT, RAND Corporation and the Great Britain National Physical Laboratory (GBNPL) launched work on the reliable transmission of information. The idea of ​​packet switching appeared, the essence of which boiled down to the fact that any information transmitted over the network is divided into several parts (packets), which then independently of each other move along different paths (routes) until they reach the addressee. Paul Baran, Donald Davis, Leonard Kleinrock conducted research in this area in parallel. Paul Baran was one of the first to publish his research in the article "Data transmission in networks". Somewhat later, Kleinrock's dissertation appeared, in which similar ideas were expressed. Networking ideas evolve against the backdrop of the ever-improving hardware platform of computers. In 1964, IBM released the new machine, the IBM 360, which established the de facto world standard for the byte — the eight-bit word — automatically making machines that used 12- and 36-bit words obsolete. IBM invested $ 5 billion in this development. In the same year, the IBM online booking system, which was named SABER (Semi-Automatic Business Research Environment), debuted. It connected 2,000 terminals in sixty cities via telephone lines.

In 1964, Lickleader leaves ARPA to return to MIT, and in collaboration with Ivan begins the development of a time-sharing operating system. Computers are gradually becoming smaller and more widespread. In 1965, DEC announced the PDP-8 that could fit on a desktop. It cost $ 18,000, a fifth of the cost of an IBM / 360. The combination of processing power, size, and cost has allowed the computer to take place in hundreds of factories, thousands of offices and research labs. In the same year, based on ARPA funding, Larry Roberts and Thomas Marill create the first Wide-Area Network (WAN). They connected TX-2 (MIT) to Q-32 in Santa Monica via a dedicated telephone line. The system confirmed the assumptions of Kleinrock, who predicted that packet switching was the most promising model for communication between computers.

A year later, Ivan Sutherland invites Bob Taylor, formerly of NASA, to continue the networking work. In the same year, ARPA-financed the JOSS (Johnniac Open Shop System) project, which is being developed at the RAND Corporation. The JOSS system provided users with online computing resources from remote terminals. A modified electric typewriter (model IBM 868) was used as consoles.

In 1966, Taylor succeeded Sutherland as director of ARPA IPTO. In his office at IPTO, there were three terminals that he could alternately connect to different computing computers via telephone wires... "Why don't we all speak at the same time?" Taylor once wondered. This question of the scientist determined a whole scientific direction, which was soon posed to the researchers of ARPA. The idea seemed so promising to Taylor that he soon managed to arrange a meeting with Charles Herzfeld, who was head of ARPA at the time. Having outlined the essence of the problem and the prospects that the study promised, Taylor, after 20 minutes of conversation, received an agreement to allocate a million dollars for the development of the project, the essence of which was to connect all ARPA IPTO clients into one network. Shortly thereafter, Taylor persuaded Larry Roberts to leave MIT to continue working on the network project at ARPA.

In 1967, another event happened that played an important role in the development of networking technologies: the modem, invented in the early sixties, was significantly improved by John Van Ging of the Stanford Research Institute (Stanford Research Institute, SRI). The scientist proposed a receiver that could reliably recognize bits of information against the background of noise interference from long distance telephone lines.

In parallel, at the same time, the English author of the idea of ​​packet switching, Donald Davies, was engaged in theoretical developments at the British National Physics Laboratory. In 1967, Larry Roberts convened a scientific conference in Ann Arbor, Michigan, to which he invited the main developers of the networking project. The conference was of great importance - the parallel work began to unite. Donald Davis, Paul Baron and Larry Roberts learned about each other's work. The term "ARPANET" was first mentioned when Larry Roberts spoke at this particular conference. At the same conference, another prominent scientist, Wesley Clark, first proposed the idea and proposed the term "IMP" - Interface Message Processors, meaning devices for managing traffic in a network, which later evolved into modern routers.

In 1968, work began on the creation of the IMP. ARPA was awarded a $ 1 million contract with a small consultancy firm, Bolt Beranek & Newman (BBN), to create four IMPs to connect the ARPANET. BBN outpaced its larger competitors due to its simple organizational structure and lack of bureaucratic hurdles. BBN was headed by Frank Hart, a man of outstanding organizational skills, whose active work allowed the small company to receive such a prestigious contract. Despite the fact that the contract was promising, only one year was released for the creation of the IMP.

In 1969, BBN successfully fulfilled the terms of a historic contract, which resulted in the ARPANET network, which covered the entire West Coast of the United States.

Seventies - Telnet, FTP, TCP / IP, USENET

In 1970, the network continues to grow - a new node is added every month. In the same year, two more important events took place. First, Denis Ritchie and Kenneth Thompson of BelLabs completed the UNIX operating system. Secondly, in the same year, the NWG (Network Working Group), led by Steve Crocker, completed work on the NCP (Network Control Protocol) protocol, and a year later completed work on the Telnet terminal emulation protocol and made significant progress in the work on the transmission protocol FTP files.

In 1971, BBN developed a new platform. The so-called TIP devices (Terminal IMP, Terminal Interface Processor) made it possible to log into remote hosts, thus making ARPANET available to more users. 1971 was significant not only for the development of network technologies; In the same year, revolutionary changes took place in the element base of computers - the 4004 microprocessor from Intel appeared. Returning to network technologies, it should be noted that the achievements were so significant that it was time for public demonstrations. In 1971, Larry Roberts decided to organize a demonstration of the ARPA network at the International Computer Communications Conference (ICCC), which was to be held in Washington in October 1972. The experiment was to be carried out in real time to show that the network not only exists, but also works. More than 40 terminals were prepared for the demonstration. AT&T provided a data feed.

The color of the then small network elite gathered to take a look at the work of the network. Donald Davis, a scientist who coined the term "packet switching", specially flew from England. The demonstration lasted for two and a half days and was attended by hundreds of people, including engineers and technicians from the telecommunications and computer industries. The demonstration at ICCC contributed greatly to the spread of packet switching ideas and showed a wide range of people for the first time that resource sharing on the network is real. As a result, the ARPANET community has gained respect, recognition for new technology, and resources. For computer manufacturers, this meant a new market emerged.

However, the ARPANET demonstration was not the only event in 1972. At the same time, at least two more events took place that had a huge impact on the development of computer technology. In 1972, Ray Tomilson (BBN) wrote a program to send email over the ARPANET. He also introduced the designation “ [email protected]"And used the @ symbol, which later (since 1980) was enshrined in the international address standard Email... (By the way, the C language appeared in the same year.) In 1973, already 30 institutes were connected to the ARPANET. ARPANET's clients include private organizations such as BBN, Xerox PARC and MITER Corporation, as well as government organizations such as NASA's Ames Research Laboratories, National Bureau of Standards and Air Force Research Facilities.

ARPA is renamed DARPA, where the letter "D" stands for Defense. Bob Kahn moves from BBN to DARPA on a project to connect ARPANET to other networks. A very complex work begins to combine networks with different interfaces, data rates and packet sizes. In essence, this was the work of creating an Internet Protocol. In September 1973, the first publication on the new TCP (Transmittion Control Protocol) appeared. In 1974 Larry Roberts joined BBN and Lickleader joined DARPA IPTO. ARPANET's daily traffic reached 3 million packets by this time.

In 1975, the US Department of Energy creates its own research center for the development of network technologies. Since 1976, DARPA has funded research at Berkeley, whose scientists have been working on modifying UNIX and creating the TCP / IP protocol. TCP / IP has become over time one of the most popular networking protocols and the de facto standard for implementing global network connections due to its openness, scalability and by providing the same capabilities to global and local networks.

In 1976, the CRAY 1 supercomputer appeared, the computing power of which attracted researchers from different parts of the United States. Many scientists have expressed a desire for remote access to powerful computing resources supercomputer. So the question of the need to organize network access to supercomputer centers. But the development of network technologies was stimulated not only by the supercomputer direction.

In 1977, the Apple II computer was announced, and the advent of desktop machines with dial-up capability gave new impetus to the networking and modem industries. In the same year, DARPA formed an International Council on Internet Issues, headed by Peter Kirstein of University College London. By early 1978, the ARPANET experiment was nearly complete.

In 1979, the USENET service was introduced, which was one of the first examples of client-server organization.

By the late seventies, the TCP/IP architecture and protocols had taken on their modern form. By then, DARPA had become a recognized leader in the development of packet-switched networks. The further development of network technologies, including wireless radio networks and satellite communication channels, stimulated DARPA's work on internetworking problems and on implementing Internet principles in the ARPANET.

DARPA made no secret of its work on Internet technologies, so various scientific groups took an interest in the development of the global network.

The Internet traces its origin to the ARPANET, but it is more often called the successor of NSFNET, the network of scientists of the American NSF (National Science Foundation), which first collaborated with the ARPANET, then merged with it, and finally absorbed it.

NSFNET appeared only in the mid-eighties, but NSF had shown interest in building scientific networks much earlier. In 1979, six American universities met to discuss the possibility of developing a Computer Science Research Network (CSNET). Bob Kahn attended this meeting as a consultant from DARPA, and Kent Curtis as a representative of the NSF (National Science Foundation). The 1979 negotiations did not lead to an agreement: NSF considered the project too expensive. A year later, however, NSF returned to the idea, which was being supported by a growing number of universities, and ultimately agreed to host the CSNET project. The project was allocated $5 million, and NSF went down in history as one of the first founders of the Internet. To help the reader correlate these successes with other milestones in computing: in that same year, the young Microsoft company offered the MS-DOS operating system, and IBM began production of its first personal computer.

Eighties - NSFNET, BBS, WWW

Many experts call the early 80s the time of the birth of the Internet. During this period, DARPA initiated the conversion of machines connected to its research networks to the TCP/IP stack. In 1981, the IWG (Internet Working Group) at DARPA published a document describing the complete transition from NCP (Network Control Protocol) to TCP/IP, which had been in development since 1974. The ARPANET became the backbone of the Internet and was actively used for numerous TCP/IP experiments.

DARPA organized a series of scientific workshops at which researchers exchanged new ideas and discussed experimental results. An ad hoc committee, the ICCB (Internet Control and Configuration Board), was established to coordinate and guide the development of Internet protocols and architecture; it met regularly until 1983.

The final transition to Internet technology took place in January 1983: that year the TCP/IP protocol was adopted by the US Department of Defense, and the ARPANET was split into two independent parts. One, intended for scientific purposes, kept the name ARPANET; the other, the large-scale MILNET network, went to the military.

To stimulate the use of the new protocols in educational institutions, DARPA made its TCP/IP implementation widely available to the university community. During this time, many researchers were using the University of California, Berkeley version of the Unix OS called BSD Unix (from Berkeley Software Distribution).

Because DARPA had earlier subsidized BBN and Berkeley to implement the TCP/IP protocols for the popular Unix operating system, more than 90% of university computer science departments adopted the new network technology, and the BSD version became the de facto standard implementation of the TCP/IP protocol stack. Several BSD releases followed, each adding new TCP/IP capabilities: 4.2BSD (1983), 4.3BSD (1986), 4.3BSD Tahoe (1988), 4.3BSD Reno (1990) and 4.4BSD (1993).

In 1985, NSF began a program to build networks around its supercomputer centers, and in 1986 the creation of a 56 Kbps core network between those centers led to the emergence of a number of regional networks such as JVNCNET, NYSERNET, SURANET, SDSCNET and BARRNET. This is how the NSFNET backbone appeared; it eventually united all these research centers and linked them to the ARPANET. NSFNET thus connected five supercomputer centers and opened access to powerful computing resources for a wide range of researchers, a task the ARPANET, because of bureaucratic problems, had failed to accomplish, which is what led to NSFNET's emergence. A large number of universities and research centers, including some outside the United States, expressed a desire to connect to this network. To reduce the cost of long-distance communication lines, it was decided to develop a system of regional networks, each uniting the computers of a particular region and connecting to similar networks nearby. In this configuration, all computers were peers, reachable from one another through neighboring networks as well as linked to the NSF supercomputers. Thus, from 1986 we can speak of the formation of the global computer network known as the Internet.

In 1988, the Internet became an international network: Canada, Denmark, Finland, France, Norway and Sweden joined it. In the same year, the BBS (Bulletin Board System) service appeared on the network.

In January 1989, the network had 80,000 nodes; in November, Austria, Germany, Israel, Italy, Japan, Mexico, the Netherlands, New Zealand and the United Kingdom joined the Internet, and the number of nodes grew to 160,000. In the same year, FDDI (Fiber Distributed Data Interface) appeared - a distributed data transmission interface over fiber-optic channels.

If the Internet is a collective invention, the idea of hypertext and the WWW is associated with one specific person. In 1989, Berners-Lee came up with the hypertext idea that sparked the creation of the World Wide Web. While working as a technical consultant at the European Laboratory for Particle Physics (CERN) in Geneva, Berners-Lee wrote the Enquire program, which became the prototype of the future WWW. In the same year, he began work on the global World Wide Web project, and just two years later (in 1991) the first WWW objects were placed on the Internet. From 1991 to 1993, he worked on improving the WWW specifications. In 1994, Berners-Lee joined the Laboratory for Computer Science at the Massachusetts Institute of Technology, where he serves as director of the WWW consortium, which coordinates the efforts of more than a hundred corporations to improve World Wide Web technology.


The formation of ARPA was part of the US response to the Soviet Union's 1957 satellite launch. ARPA was funded by the US Department of Defense; the task of this chosen few was to distribute an annual budget of several billion dollars among universities and laboratories for the work most important from a national security point of view. By the 1960s, ARPA's main work was devoted to developing a method of connecting computers to one another (conducting experiments in computer communications), as well as:

* unification of the scientific potential of research institutions;
* study of ways to maintain stable communications in the face of a nuclear attack;
* development of the concept of distributed control of military and civilian structures during the war.

Before work on the ARPANET began, the very idea of a network took shape within ARPA's Information Processing Techniques Office. In the spring of 1967, at the University of Michigan, ARPA held its annual meeting of "principal investigators" from each of its universities and other contractors. The previous year's research results were reviewed and promising research directions were identified. Networking was one of the topics raised at this meeting.

"At the meeting it was agreed that work would begin on the conventions to be used for exchanging messages between any pair of computers in the proposed network, and also on consideration of the kinds of communication lines and data sets to be used. In particular, it was decided that the 'protocol' conventions would include message framing and block transmission, error checking and retransmission, and computer and user identification. <...>"
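The conventions listed in that quote can be made concrete with a toy sketch: framing a message into blocks, attaching a checksum, and retransmitting corrupted blocks. The frame layout and checksum below are invented for illustration and are not the actual ARPANET protocol.

```python
# Toy illustration of framing, error checking and retransmission.
# Frame layout (invented for this sketch): 1 checksum byte + payload block.
from typing import List, Optional

BLOCK_SIZE = 4  # payload bytes per block

def make_frames(payload: bytes) -> List[bytes]:
    """Split a payload into blocks, each prefixed with a 1-byte checksum."""
    frames = []
    for i in range(0, len(payload), BLOCK_SIZE):
        block = payload[i:i + BLOCK_SIZE]
        frames.append(bytes([sum(block) % 256]) + block)
    return frames

def receive(frame: bytes) -> Optional[bytes]:
    """Validate a frame; return its block, or None to request retransmission."""
    checksum, block = frame[0], frame[1:]
    return block if sum(block) % 256 == checksum else None

def transfer(payload: bytes, corrupt_first_try: bool = False) -> bytes:
    """Send every frame, retransmitting any that arrive corrupted."""
    received = b""
    for frame in make_frames(payload):
        # optionally flip the checksum byte to simulate line noise
        attempt = bytes([frame[0] ^ 0xFF]) + frame[1:] if corrupt_first_try else frame
        block = receive(attempt)
        while block is None:        # receiver rejected the frame: retransmit
            block = receive(frame)  # the retransmission arrives intact
        received += block
    return received

print(transfer(b"LOGIN", corrupt_first_try=True))  # b'LOGIN'
```

Even this crude version shows why such conventions had to be agreed in advance: sender and receiver must share the frame layout and the checksum rule before a single byte can be trusted.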

However, the ARPANET's transition from NCP to TCP/IP took place only on January 1, 1983. It was a flag-day transition that required simultaneous changes on all computers. The change had been carefully planned by all stakeholders over the preceding several years and went surprisingly smoothly (though it did give rise to a proliferation of "I survived the TCP/IP transition" buttons).

In 1983, the transfer of the ARPANET from NCP to TCP/IP made it possible to divide the network into MILNET, a network proper for military needs, and ARPANET, which was used for research purposes.


In 1984, the ARPANET faced a formidable rival: the US National Science Foundation (NSF) founded the vast intercollegiate network NSFNet, which had much higher bandwidth (56 Kbps) than the ARPANET.

In 1990, the concepts of ARPANET, NSFNET, MILNET and the like finally left the scene, giving way to the concept of the Internet.

So what, the reader may ask? Nothing special; just pay attention to the year in which it happened, and to the fact that these two computers were the first nodes of the network that later became known as ARPANET.

Yes, yes, the very network from which the entire Internet is said to have grown. The same one which, according to later mythology, was supposedly designed for the event of a nuclear war, to provide stable communications when direct channels are knocked out.

In fact, this really is a myth: although the ARPANET was begotten by the Advanced Research Projects Agency (ARPA, now DARPA), it was not a purely defense project but rather a private initiative, for whose development ARPA attracted substantial funds.

The private (well, almost) person around whom the whole story revolved was the computer scientist J.C.R. Licklider, who worked at BBN. In August 1962, he published several papers on the construction of what he called the Intergalactic Computer Network. They outlined almost all the basic principles by which today's Internet functions.

In October 1963, Licklider was appointed head of the behavioral sciences and command-and-control programs at the Advanced Research Projects Agency at the Pentagon.

Licklider then spent a long time talking with Ivan Sutherland and Bob Taylor - later, and deservedly, they would be called pioneers of the Internet - and managed to convince them that his ideas were worth implementing. However, Licklider left ARPA before his concept was accepted for development.

ARPA had its own interest in a computer network that would let different computers exchange messages: the Agency sponsored research at various commercial and academic institutions (including in computer science) and was interested in having those researchers use computers in their work. ARPA also supplied the computers.

In addition, such a network could accelerate the dissemination of information about new research results and new software.

As Charles Herzfeld, a former director of ARPA, later said, the ARPAnet project grew out of "frustration over the limited number of large and powerful research computers in the country, and the fact that many researchers who needed access to them could not get it due to geographic distance." Another word against the common idea that the ARPAnet was created "in case of nuclear war."

However, given that ARPA/DARPA's main profile is precisely military technology, and that the Cold War was in full swing, a military purpose will be attributed to the ARPAnet for a long time to come; and hardly without reason.

Taylor had three computer terminals in his office, each connected to a different ARPA-funded computer: the Q-32 system at the System Development Corporation, Project Genie at the University of California, Berkeley, and the Multics system at MIT. Each terminal had its own command system, and each required, as we would now put it, a separate login...

Laziness, as you know, is the engine of progress, and Taylor came to the logical conclusion that it would be nice to be able to connect from one terminal to any other computer.

Incidentally, at practically the same time there was active work in the field of packet routing; the first public demonstration took place on August 5, 1968 at the National Physical Laboratory in Great Britain.

By mid-1968, Taylor had prepared a complete plan for the computer network, and after ARPA approval, requests for proposals were sent out to 140 potential contractors.

And here it was discovered that almost no one wanted it. The overwhelming majority considered ARPA's proposal insane; only 12 organizations responded in substance, and only four of those were considered by ARPA as primary contractors. By the end of 1968 just two remained, and in the end the contract went to the above-mentioned BBN Technologies.

A team of seven specialists was quickly able to design the first working machines: based on the Honeywell DDP-516 computer, they produced the first IMPs (Interface Message Processors), devices resembling modern routers.

True, not in size.

Each IMP received and forwarded data packets; it was connected to a modem linked to leased lines. The host computer was connected to the IMP itself via a special serial interface.
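The store-and-forward behavior an IMP performed can be sketched in a few lines: accept a packet, look up the next hop toward its destination, and pass it on. The four-node topology and routing table below are invented for illustration (loosely named after the first ARPANET sites), not the actual IMP routing algorithm.

```python
# Minimal sketch of store-and-forward packet relaying between nodes.
# Each node holds a next-hop table keyed by destination; a packet is
# handed from node to node until it reaches its destination.

# Invented next-hop routing tables for a tiny demo network.
ROUTES = {
    "UCLA": {"SRI": "SRI", "UCSB": "UCSB", "UTAH": "SRI"},
    "SRI":  {"UCLA": "UCLA", "UCSB": "UCLA", "UTAH": "UTAH"},
}

def forward(node: str, packet: dict) -> str:
    """Return the next hop this node sends the packet to."""
    return ROUTES[node][packet["dst"]]

packet = {"src": "UCLA", "dst": "UTAH", "payload": b"LOGIN"}
hops = ["UCLA"]
while hops[-1] != packet["dst"]:
    hops.append(forward(hops[-1], packet))
print(" -> ".join(hops))  # UCLA -> SRI -> UTAH
```

The point of the design is the same as in the original network: the host hands its packet to the local IMP, and only the IMPs need to know anything about the topology.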

A workable system, with all its hardware and software, was built in nine months. A symbolic term, isn't it?

And so, on October 29, the first attempt was made to exchange messages between two computers. The first hello came out garbled: only the letters L and O of the word LOGIN got through (fittingly, "lo" is now a common abbreviation of "hello"), after which the system fainted. A few hours later it was brought back to its senses, and the word LOGIN reached the Stanford machine...

This is how ARPAnet began.

By the beginning of December 1969, the ARPAnet consisted of four nodes; by September 1971 there were already 18, and growth continued exponentially. In 1973, the ARPAnet was "publicly presented": in October, at the First International Conference on Computers and Communications in Washington, DC, ARPA staff demonstrated the system by linking computers located at 40 different sites across the United States. The demonstration attracted considerable interest, and new networks built on similar principles began to appear alongside the ARPAnet.

Perhaps the most significant later development was the creation by ARPA and Stanford of the Transmission Control Protocol / Internet Protocol (TCP/IP). It is this protocol stack that lies at the heart of the modern Internet to this day.

The ARPAnet formally ceased to exist in 1990. On the other hand, the entire Internet today rests on its basic principles, so to some extent the ARPAnet turned out to be immortal.



Plan:

    Introduction
  • 1. History
  • 2. Objectives of the ARPANET project
  • 3. Legacy
  • Notes

Introduction

Logical map of the ARPANET, March 1977

ARPANET (from the English Advanced Research Projects Agency Network) - a computer network created in 1969 in the United States by the US Department of Defense's Advanced Research Projects Agency (ARPA) and the prototype of the Internet. It was the first network in the world to switch to routing data packets (completed on January 1, 1983). The ARPANET ceased to exist in June 1990.


1. History

In 1969, the US Department of Defense decided that in case of war, America would need a reliable communication system, and the Advanced Research Projects Agency (ARPA) proposed developing a computer network for this purpose. Development of the network was entrusted to the University of California, Los Angeles, the Stanford Research Institute, the University of Utah, and the University of California, Santa Barbara.

The first test of the technology took place on October 29, 1969 at 21:00. The network consisted of two terminals: the first was at the University of California, Los Angeles, and the second, 600 km away, at the Stanford Research Institute. The test consisted of the first operator typing the word "LOG" and the second confirming that he saw it on his screen. The first experiment was unsuccessful: only the letters "L" and "O" got through. An hour later the experiment was repeated, and everything went well.

The computer network was named ARPANET. Within the project, it united the four scientific institutions listed above, and all work was funded by the US Department of Defense. The ARPANET then began to grow and develop actively, and scientists from various fields of science began to use it. In 1973, the first foreign organizations, from Great Britain and Norway, connected to the network, making it international. The cost of sending an email over the ARPANET was 50 cents. In 1984, the ARPANET faced a formidable rival: the US National Science Foundation (NSF) founded the vast intercollegiate network NSFNet, which had much higher bandwidth (56 Kbps) than the ARPANET. In 1990, the ARPANET ceased to exist, having completely lost the competition to NSFNet.


2. Objectives of the ARPANET project

  • conducting experiments in the field of computer communications;
  • combining the scientific potential of research institutions;
  • exploring ways to maintain stable communications in the face of a nuclear attack;
  • development of the concept of distributed control of military and civilian structures during the war.

3. Legacy

Many existing Internet protocols have their origins in the ARPANET. For example, the reverse DNS lookup protocol still uses the top-level domain ".arpa": to find the records for the IP address 1.2.3.4, you query the name 4.3.2.1.in-addr.arpa.
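The mapping is mechanical: reverse the octets of the IPv4 address and append the ".in-addr.arpa" suffix. A small sketch in Python (the function name is ours):

```python
# Build the in-addr.arpa name used for a reverse DNS (PTR) lookup:
# reverse the address's octets and append the ".in-addr.arpa" suffix.
def reverse_dns_name(ip: str) -> str:
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError(f"not an IPv4 address: {ip!r}")
    return ".".join(reversed(octets)) + ".in-addr.arpa"

print(reverse_dns_name("1.2.3.4"))  # 4.3.2.1.in-addr.arpa
```

Python's standard library exposes the same mapping directly as `ipaddress.ip_address("1.2.3.4").reverse_pointer`, a small everyday reminder of the ARPANET's legacy.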


