deepb, ok then: a database of big text segments, picking the most appropriate ones according to the page topic, inserting a few, inserting a link, inserting a few more... Makes sense. On the back-end, it could also harvest more text from the page and add it to its database.
Great algorithm.
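The stitching described above could be sketched roughly like this. Everything here is hypothetical (the segment database, the scoring, the function names are all made up for illustration): rank canned segments by overlap with the page's words, splice in the link, and scrape new sentences back into the database.

```python
# Hypothetical sketch of the spam bot described above; all names invented.
SEGMENT_DB = [
    "I really enjoyed reading this.",
    "Great point about search engines.",
    "Totally agree with the author here.",
    "My cousin makes good money doing this.",
]

def score(segment, topic_words):
    # Crude relevance: count how many of the page's words appear in the segment.
    return len(set(segment.lower().split()) & topic_words)

def compose_spam(page_text, link, n_before=2, n_after=1):
    # Pick the most topic-relevant segments, insert a few, the link, a few more.
    topic_words = set(page_text.lower().split())
    ranked = sorted(SEGMENT_DB, key=lambda s: score(s, topic_words), reverse=True)
    return " ".join(ranked[:n_before] + [link] + ranked[n_before:n_before + n_after])

def harvest(page_text, min_len=20, max_len=120):
    # Back-end step: scrape plausible sentences from the page into the database.
    for sentence in page_text.split("."):
        sentence = sentence.strip()
        if min_len <= len(sentence) <= max_len and sentence + "." not in SEGMENT_DB:
            SEGMENT_DB.append(sentence + ".")
```

With each harvested page the database grows, so later comments have more material to draw from, which is presumably what makes the output hard to fingerprint.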
(P.S. I was being sarcastic when I said "human"... What was posted was indistinguishable from a human by any kind of algorithm; it could easily pass Gmail's spam filters.)